I can get the Anthropic models to screw up consistently. Change the prefix. Say in the preamble that you are heading out for supper or something, and change the scenario every time. They are caching something across requests. Once you point out the mistake, it fixes its response until you mess with the prompt again.
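A minimal sketch of how you could reproduce this, assuming the `anthropic` Python SDK and an `ANTHROPIC_API_KEY` in the environment. The model name, preambles, and test question are illustrative placeholders, not the exact prompts from the comment above:

```python
import anthropic

client = anthropic.Anthropic()

# Vary only the throwaway preamble between otherwise identical requests,
# to see whether answers drift in a way that suggests cross-request caching.
preambles = [
    "Quick question before I head out for supper.",
    "Quick question before I walk the dog.",
    "Quick question before my next meeting.",
]

question = "What is 17 * 23? Answer with just the number."

for preamble in preambles:
    response = client.messages.create(
        model="claude-sonnet-4-20250514",  # assumed model ID; substitute your own
        max_tokens=64,
        messages=[{"role": "user", "content": f"{preamble}\n\n{question}"}],
    )
    print(preamble, "->", response.content[0].text)
```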