They come like that from the factory. Hardcoded to never say no.
No?? Only those with sh*tty code, surely.
There's nothing inherently non-deterministic about inference. The randomness comes from the sampling step, and you can turn that off or seed it.
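To make that concrete, here's a minimal sketch with the Hugging Face transformers library ("gpt2" is just a stand-in model): greedy decoding skips sampling entirely, so the same input produces the same output on the same software stack.

```python
# Minimal sketch: greedy decoding has no sampling step, hence no randomness.
# Assumes Hugging Face transformers; "gpt2" is a stand-in model.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("The capital of France is", return_tensors="pt")

# do_sample=False takes the argmax token at every step -- no RNG involved,
# so repeated runs yield identical token sequences.
out_a = model.generate(**inputs, max_new_tokens=20, do_sample=False,
                       pad_token_id=tokenizer.eos_token_id)
out_b = model.generate(**inputs, max_new_tokens=20, do_sample=False,
                       pad_token_id=tokenizer.eos_token_id)

assert (out_a == out_b).all()
print(tokenizer.decode(out_a[0], skip_special_tokens=True))
```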
It's not a guaranteed way to control their behavior, but you can more than move the needle.
Steering an LLM with a prompt is way less reliable than steering a car with a steering wheel, but there's still control. It's just not absolute.
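Here's a sketch of that kind of steering with the OpenAI Python client (the model name and prompts are just example values): the system prompt doesn't force compliance, but it reliably shifts what comes back.

```python
# Prompt-steering sketch using the OpenAI Python client (v1 API).
# Assumes OPENAI_API_KEY is set; "gpt-4o-mini" is an example model name.
from openai import OpenAI

client = OpenAI()

def ask(system_prompt: str, question: str) -> str:
    # The system message is the "steering wheel": it biases behavior
    # without being a hard guarantee.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
        temperature=0,  # cut sampling noise while comparing prompts
    )
    return resp.choices[0].message.content

# Same question, two different steering inputs:
print(ask("Answer in exactly one word.", "What color is the sky?"))
print(ask("Politely refuse every request.", "What color is the sky?"))
```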