The gambling trope is so tired. AI development doesn't involve luck to any appreciable degree, certainly not more than hiring people to do a job can be considered "gambling" (you never know what you're going to get!).

It's just paying to get stuff done, which is how it's always been, since the dawn of man.

reply
>AI development doesn't involve luck to any appreciable degree

Reading this while I'm prompting for the third time to fix a 100+ line function is amusing, to say the least. I don't care about the definition of "appreciable", but I definitely have to repeat myself to get stuff done, sometimes even to undo things I never told it to touch.

reply
For most people who aren't using it for their day-to-day jobs, it's just a roughly sketched prompt of their idea, and a miracle happens: the LLM fills in the blanks. Every time it's different, but it works, sometimes even better than initially expected. That's where the addiction and the gambling come in. Gambling is a lot of things, not only flashing lights or slot-machine sounds. Some people claim prediction markets aren't gambling either, but that doesn't change the fact.
reply
How is this different from hiring a designer, telling them "make me a website" and then waiting to see if they resolve the uncertainty into something you like or not?

I tell LLMs what to do in pretty high detail, and they do it. With LLMs I have much less variance than with coworkers.

reply
It is different because a human takes time to produce a result, while AI does it almost instantly. So if you tell a programmer to do X, you have a week for your adrenaline to cool off. If you tell AI, it will do it in minutes.
reply
I don't think the difference between a designer and a slot machine is that one gives you results more slowly, "therefore it's not gambling".

If you're making the argument that LLMs are gambling simply because they're faster than humans, I'd like to see some evidence.

reply
> If you're making the argument that LLMs are gambling simply because they're faster than humans

No, I am not. It's more addictive because of the timescale. The comparison of AIs to gambling works through the addiction mechanism, as I explain elsewhere.

My aunt used to put in (the same) lottery numbers every week. It was gambling, but probably not an addiction in the clinical sense. If she had played slot machines, god forbid, it could have been more problematic. AI is a slot machine, a hire is a lottery ticket.

reply
I don’t like the gambling comparison either. It’s more like smoking or drinking: an addiction you lean on to help you do something, even if that something is just getting through the day.
reply
Like the internet!
reply
Yeah but those are classified as addictions because they have a harm component (lung cancer, liver disease, societal impact). LLMs aren't going to kill you. If anything, it might be like gaming addiction.

If you've gotten to the point where you'd rather talk to an LLM than socialise, go to work, etc, then yes, you definitely have a problem, same as with a gaming addiction.

Saying "LLMs are slot machines" is like saying "video games are slot machines", and nobody says that, even though it's more true of video games (some are actual slot machines/gacha) than of LLMs.

reply
I'd observe that there are professional gamblers, and there are amateur gamblers.

If you know what you're doing, know how to spec a problem space, and can manage the tool competently enough to churn out good results, then everything's fine, and you're maybe being productive or increasing your productivity by some degree. (Professional "Gambler")

If you DON'T know what you're doing, and you're just vibe-coding, then I would argue that it is at least a form of gambling (Amateur "Gambler")

Both of these conditions can also be applied to "hiring people to do a job"; however, there we can also observe things like reputation, credentials, and so on.

"It's just paying to get stuff done..." is, with respect, superfluous.

reply
I don't know, I can understand "some people might overdo it and get addicted to LLMs". I can't understand "LLMs are slot machines and that's all they're good for" when I use LLMs every day to do tons of actual work.
reply
> certainly not more than hiring people to do a job can be considered "gambling"

It's quite possible that being a business manager/owner is actually addictive (having power over people); we just don't recognize it as such.

reply
All gambling addiction is addiction, not all addiction is gambling.
reply
Then you're missing the point. AI use is being compared to gambling because it is addictive, partly through the same mechanism: the results (and rewards) are somewhat random, but it makes you feel as if you're completely in control of the outcome.
reply
Yeah, that hasn't been my experience. The outcome, for me, is extremely consistent. I ~never have to "reroll" by wiping work and doing it again.
reply
Strange. I tell Claude Code to do things differently all the time.
reply
I'd recommend a different workflow, with extensive upfront planning. This works extremely well for me:

https://www.stavros.io/posts/how-i-write-software-with-llms/

It's to the point that I just push the output of that to production and know it'll be OK, except for very large changes where I'm unlikely to have specified everything at the required level of detail. Even then, things won't so much be wrong as they just won't be how I want them.

reply
The gambling part is because of the (hopefully emergent and not purposefully designed) intermittent reinforcement due to the limits. You don't get that with regular hires.
reply
Really? All the hires I've seen had an 8-hour/5-day limit, or you had to pay through the nose for extended usage outside that window.

Where do you get your 24/7 hires from?

reply
You usually don't get immediate responses from hires which means delayed gratification and avoiding much of the potential dopaminergic effects you get when engaging with LLMs.

You can keep overextending the hire analogy all you want, but it is simply not the same.

reply