I hallucinated nothing, and my point still stands.

You actually described a software bug by ascribing intentionality to an LLM. That you "hedged" the language by saying it "behaved as if it wanted" does little to change the fact that this is not how people normally describe a bug.

But when it comes to LLMs, there's this pervasive anthropomorphic language used to make them sound more sentient than they actually are.

Ridiculous talking points implying that I am angry are just routine deflection. People normally do that when they don't like criticism.

Feel free to have the last word. You can keep talking about LLMs as if they are sentient if you want; I already pointed out the bullshit and stressed the point enough.

If you believe that, you either have not reread my original comment or are repeatedly misreading it. I never said what you claim I said.

I never ascribed intentionality to an LLM. This was something you hallucinated.
