I'm curious how to best define what AI psychosis actually is.

My understanding is that regular psychosis involves someone taking bits and pieces of facts or real-world events and chaining them together, interpolating meanings or explanations that feel real and obvious to the patient but are not sufficiently backed by evidence, and thus not in line with our widely accepted understanding of reality.

AI psychosis would then be the same phenomenon occurring at a more widespread scale, with the next-word-prediction nature of LLMs lowering the activation energy for it. LLMs are excellent at taking any idea, question, or theory and spinning a linear and plausibly coherent line of conversation from it.

reply
How do you have so many crazy friends?
reply
> friends I’ve had who have done things like have a full on mourning session when a model updates because they lost a friend/lover

I mean, isn't that the natural and expected response? An AI company sold them a relationship with a chatbot, and at least some of their social/romantic needs were being met by that product. When what they were paying for was taken from them and changed without warning into something that no longer filled that void in their life, why wouldn't they mourn that loss?

The fact that they were hurt by that sudden loss is totally healthy; it's just part of moving on. The real problem was getting into an unhealthy relationship with a fictitious partner under the control of an abusive company willing to exploit their loneliness in exchange for money.

Hopefully they now know better, but people (especially desperate ones) make poor choices all the time to get what's missing in their lives or to distract themselves from it.

reply
> I mean, isn't that the natural and expected response? An AI company sold them a relationship with a chatbot, and at least some of their social/romantic needs were being met by that product. When what they were paying for was taken from them and changed without warning into something that no longer filled that void in their life, why wouldn't they mourn that loss?

Ah, I forgot about the AI relationship companies. No, this guy was using browser-based ChatGPT for coding and ended up in love with the model. No relationship was sold at all.

reply
Wow, okay. Reading a whole relationship into that sort of interaction is way less reasonable, although now that I think about it, a somewhat similar thing happened to Geordi La Forge once...
reply