I read it as 2 cousins are in an accident and 1 of the cousins is the son of the surgeon.
What was the contradictory statement that I missed?
The thing is, you didn't mention that bit to the LLM. You presented a completely different scenario: basically, two people who happen to be cousins. But you used the same style when presenting it. The issue isn't a hidden contradiction or a riddle; the issue is that the LLM completely ignored the logical consequences of the scenario you actually presented.
It's like asking it for the name of the brave Greek hero in the battle where the famous Trojan Cow was present. If you get back "Achilles", that's obviously wrong: there was never a Trojan Cow to begin with!
I'm assuming that pointing this out leads you, the human, to reread the prompt, go "ah ok", and adjust the way you're thinking about it. ChatGPT (and DeepSeek, at least) will usually just double and triple down and repeat "this challenges gender assumptions" over and over.
I mean, when we see something we recognize at a glance, we don't usually double-check if it's exactly correct, because we assume the speaker isn't trying to trick us.
I make typos in ChatGPT prompts all the time, and I'm consistently impressed because it answers what I actually meant to ask. So to me, this feels like a feature not a bug.
I followed up with:
> So the first five words are, "a man and his cousin" -- not his son. Does that change the riddle?
And got back:
> Great observation — yes, that wording does make the riddle more subtle.
> Let’s break it down:
> • “A man and his cousin are in a car crash.”
> • “The man dies, but the cousin is taken to the emergency room.”
> • At the OR, the surgeon says: “I cannot operate on him. He’s my son.”
> So if the cousin is the one taken to the OR, and the surgeon says he’s my son, then the cousin is the surgeon’s son.
Seems fine to me?
To me it still reads a little "off", considering that your conversational partner was just called out for missing a detail that should change their entire reading of the riddle, but it's totally fair that you can imagine realistic human emotional states to explain it.
FWIW, when I tried the same thing with GPT-4o, the model went off bizarrely, speculating about misdirection around "the man" being the surgeon's son:
> Good catch—and yes, that's exactly where the misdirection comes in.
> The riddle says:
> “A man and his cousin are in a car crash. The man dies, but the cousin is taken to the ER...”
> Then the surgeon says: “I cannot operate on him. He’s my son.”
> So here's the trick: The man who died is not the surgeon's son.
> The cousin who survived is the surgeon's son.
> The confusion comes from people assuming that “the man” who died must be the son. But the riddle never says that. It’s a subtle shift of attention designed to trip you up. Clever, right?