lol I once used a similar "you're a machine so just do as you're told" line in a prompt, and it answered back: "I'm not a machine, I'm Claude, a helpful assistant." It then refused to do what I asked, claiming I didn't have the authority to make the decision I'd asked it to convey in writing.
reply
> Don't try and add personality or humour; remember you're a robot."

> remember you're a robot."

The anthropomorphization juxtaposed with the actual command is a bit ironic.

reply
It really does make you wonder why all the models seem to require that. In principle, it shouldn't be a property of LLMs, and lol no it's not an "emergent property".
reply
Post-training and "human preference" according to "data". I don't know a single developer who uses these tools for work and prefers that, but I also don't know anyone who uses LLMs a lot just "for fun", so it might just be vastly different preferences between the two user bases.
reply
I'd add "no ass-kissing"
reply
I like it. Have you tried putting this in your LLM system prompt?
reply
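For anyone who hasn't tried it: a minimal sketch of what "putting this in your system prompt" looks like in practice. The `build_messages` helper is made up for illustration; the wording is lifted from this thread, and the commented-out SDK call assumes the OpenAI Python client.

```python
# System prompt wording taken from the thread above.
SYSTEM_PROMPT = (
    "Don't try and add personality or humour; remember you're a robot. "
    "No ass-kissing."
)

def build_messages(user_text):
    """Return an OpenAI-style messages list with the system prompt first.

    Helper name is hypothetical; the dict shape matches the common
    chat-completions message format (role + content).
    """
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": user_text},
    ]

# With the openai SDK this would be passed along the lines of:
# client.chat.completions.create(model="gpt-4o",
#                                messages=build_messages("Summarize this."))
```

Anthropic's API takes the same text as a top-level `system` parameter instead of a message, but the idea is identical.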
need prompt macros
reply
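"Prompt macros" could be as simple as named snippets expanded into the prompt before sending. A toy sketch; the macro names and the `%name%` syntax here are invented for illustration, not any existing tool:

```python
# Named prompt snippets; names and texts are illustrative only.
MACROS = {
    "robot": "Don't try and add personality or humour; remember you're a robot.",
    "no-flattery": "No ass-kissing.",
}

def expand(prompt: str) -> str:
    """Replace each %name% token with its macro text."""
    for name, text in MACROS.items():
        prompt = prompt.replace(f"%{name}%", text)
    return prompt

# expand("%robot% %no-flattery% Now summarize the attached doc.")
# yields the full instructions followed by the task.
```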