I think AI literally makes even being wrong feel like getting something done. And that is the addictive part for people.
reply
Look at all this text I have! It can't be worthless, right?!
reply
"Near-Miss" effect: https://harprehab.com/blogs/the-psychology-of-risk-why-gambl...

I believe that's the strongest pattern in LLM gambling. I was listening to Syntax and they described it as: "Even though the LLM did it wrong 4 times, that 5th time could be right, so why not just go!"; paraphrased, of course.

It also explains the meta-LLM business, where all these CEO types put in some question and, because the LLM just knows all these words, they believe it's valuable because it's "almost" correct, even when that last correction may be forever elusive. These machines aren't thinking; they're patterning a highly regularized language beneath the looser descriptions.

There'll definitely be a winner in the AI bubble, but it'll only be visible after it pops.

reply
Having used agents some, I think 'addictive behavior' really is the closest description of the feeling it gives me as well. I don't find it engaging my critical-thinking brain; in fact it often subverts that in favor of 'get the next dopamine hit faster' behavior (i.e., just rerun it, which leads to the metaphor the OP is using). It takes a conscious effort for me to get back out of that cycle and start thinking about the fine details of what the code really does, or why I wanted it to do that in the first place. I have called it 'smoking vibes' and 'chasing rAInbows' in my sillier moments. It really does feel good... too good :P
reply