You can't "patch" LLMs in 4 hours, and this is not the kind of question that would trigger a web search.
This has been viral on TikTok for at least a week. Not really 4 hours.
You can pattern-match on the prompt (input), then either (a) stuff the context with helpful hints for the LLM, e.g. "Remember that a car is too heavy for a person to carry", or (b) upgrade the request to a "thinking" model.
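A minimal sketch of what that routing could look like. Everything here is invented for illustration (the patterns, the hint text, the "fast"/"thinking" tier names); it just shows the pattern-match-then-stuff-or-escalate idea:

```python
import re

# Hypothetical patch table: a regex over the user prompt mapped to a hint
# that gets stuffed into the context before the LLM sees it.
PATCHES = [
    (re.compile(r"\bcarry\b.*\bcar\b|\bcar\b.*\bcarry\b", re.I),
     "Remember that a car is far too heavy for a person to carry."),
]

# Prompts that smell like trick questions get escalated to a slower tier.
THINKING_TRIGGERS = re.compile(r"\briddle\b|\btrick question\b", re.I)

def route(prompt: str) -> dict:
    """Return the (possibly hint-stuffed) prompt and which model tier to use."""
    hints = [hint for pat, hint in PATCHES if pat.search(prompt)]
    model = "thinking" if (hints or THINKING_TRIGGERS.search(prompt)) else "fast"
    context = "\n".join(hints) + "\n" + prompt if hints else prompt
    return {"model": model, "prompt": context}
```

A known-bad prompt gets the hint prepended and the "thinking" tier; everything else passes through untouched to the fast tier.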
Yes, I’m sure that’s what engineers at Google are doing all day. That, and maintaining the moon landing conspiracy.
If they aren't, they should be (for more effective fraud). Devoting a few of their 200,000 employees to making criticisms of LLMs look wrong seems like an effective use of the marketing budget.
deleted
A tiny bit of fine-tuning would take minutes...
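A quick patch fine-tune only needs a handful of corrected examples. A sketch of assembling such a dataset, assuming the common prompt/completion JSONL format (the examples themselves are invented):

```python
import json

# Hypothetical patch dataset: a few corrected examples targeting the
# specific failure, in prompt/completion JSONL form.
examples = [
    {"prompt": "Can one person carry a car?",
     "completion": "No. A typical car weighs well over 1,000 kg, far more than a person can carry."},
    {"prompt": "Could a man carry a car across a bridge?",
     "completion": "No. Cars are far too heavy for a person to carry; he could drive it across instead."},
]

def to_jsonl(rows: list[dict]) -> str:
    """Serialize the examples as one JSON object per line for upload."""
    return "\n".join(json.dumps(r) for r in rows)
```

The resulting JSONL string is what a typical fine-tuning endpoint ingests; with a dataset this small, the training run itself is short.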
You absolutely can, either through the system prompt or by hardcoding overrides in the backend before the prompt even hits the LLM, and I can guarantee that companies like Google are doing both.
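A minimal sketch of the hardcoded-override approach (the trigger phrase and canned answer are hypothetical, not anything Google actually ships): match a normalized form of the question against a lookup table and short-circuit the model entirely on a hit.

```python
# Hypothetical override table: normalized trigger phrase -> canned answer.
OVERRIDES = {
    "can a man carry a car": "No. A car weighs over a tonne; a person cannot carry one.",
}

def normalize(q: str) -> str:
    """Lowercase, trim punctuation, and collapse whitespace for matching."""
    return " ".join(q.lower().strip("?! .").split())

def answer(question: str, call_llm) -> str:
    key = normalize(question)
    for trigger, canned in OVERRIDES.items():
        if trigger in key:
            return canned          # hardcoded override; the LLM never sees it
    return call_llm(question)      # everything else goes to the model
```

The viral prompt returns the canned answer without a model call; any other question falls through to whatever LLM backend is plugged in.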