Just because he's operating in the realm of smart nerds doesn't mean he's immune to the value-inverting effects of social media.
Or those of hype, e.g. AI hype.
I imagine it doesn't run very cheaply.
But LLMs are trying to mimic people. So if confusion is the human response, what's to stop the LLM from acting confused?