When OP wrote about LLMs "thinking", he implied that they have an internal, self-reflective conceptual state. They don't; they *are* merely next-token-predicting statistical machines.
reply
This was true in 2023.
reply
And it still is today.
reply