We're in the middle of an active cold war where countries are trying to manipulate the citizens of rival countries to destroy their civilization without having to fire a single bullet. Anonymous mass manipulation over the internet, all for a minimal electricity cost.
reply
That's definitely the most insidious use, but I think the larger portion is advertisers and karma farmers (who later sell to advertisers).
reply
https://www.npr.org/2024/09/05/nx-s1-5100829/russia-election...

If Russia is willing to spend cash like that, then of course they're willing to run massive bot farms to pollute any forums they can. I'd be shocked if the US was not doing the same in any way it can. You have to ask why Trump killed Radio Free America as well when it was clearly not a big expense.

reply
> Trump killed Radio Free America as well

Not sure how this relates to the subject in a direct way. Radio Free America was an outlet explicitly created and used to spread US propaganda, barely disguised as a journalistic enterprise (not really disguised, even; if you were listening to RFA, you knew what you were listening to). Shutting it down seems to be a counterpoint to all of the covert participation of US intelligence on the web, which has done nothing but escalate.

reply
It was a head-scratching decision that few believe was made for the stated reason. Other countries are ramping up their propaganda arms while Trump shut down part of the US's. The stated reasoning was cost, but that doesn't make a lot of sense in the grand scheme of things. Foil-hat types would easily believe it was the puppet doing the bidding of the one pulling the strings. RFA has been a thorn in despots' sides for a long time.
reply
> You have to ask why Trump killed Radio Free America as well when it was clearly not a big expense.

The obvious answer to that question is "because he's a Russian asset". But that doesn't mean the obvious answer is also the correct one.

IMHO, we're seeing another and much more concerning trend at play here... the utter and complete rejection of anything but violence by the far right. Diplomacy? Development aid? Cultural exchange? All sorts of soft power have been under attack for decades now, and not just by the far right but (especially when it comes to development aid) also by mainstream centrist parties across the Western world. And it's always pseudo-masculine / "strongman" BS backing the sentiment - Björn Höcke, German AfD mastermind, comes to mind with "we have to rediscover our masculinity" [1], as do Hungary's Viktor Orban with his denouncement of LGBT people, and Trump's entire œuvre.

I'm not saying that violence, or at least being prepared, ready, and willing to use it, is automatically bad. Far from it. But all the various forms of "soft power"? They have a lot of value, value that the far right is all too willing to just burn for entertainment.

[1] https://blogs.taz.de/zeitlupe/2019/03/24/die-auferstehung-de...

reply
Wouldn't it be more productive to talk about the systemic framework leading to this inflamed state of affairs, and about ways we can tackle the issue at the ground level? Perhaps inhabitants of the West would prefer pseudo-masculinity to another few decades of migrant influx without corresponding upgrades to social infrastructure. This sort of internal struggle provides a ripe substrate for foreign agents to perform subterfuge, especially in a screen-based world where the narrative can be remotely influenced. In the end, the population has been convinced that voting far right is the correct decision in its favor, but the question remains: who is it really in favor of? Call me a centrist all you like, but members of my family were executed under communist regimes, so I find it pointless to focus on one side of the yin/yang here (in other words, extremists are violent regardless of their political persuasions).
reply
> in other words, extremists are violent regardless of their political persuasions

No matter where you look, the far right kills and maims substantially more people than the far left does.

reply
AI is particularly bad at this, and regimes that employ these tactics are generally not short of the labour to have humans do it.

If AI is being used in these areas, it is less an attempt to manipulate than an attempt to just create noise and engender distrust in what people hear.

reply
Established accounts are worth money, often for scamming/propaganda.

Not too dissimilar to people bot-leveling in MMOs to then sell the accounts.

reply
It's very common these days for folks to search Reddit to find reviews of products and the like. If you can have a bot account post a fake review of how awesome your product is, and have that upvoted, it can pay huge dividends.
reply
I've noticed four categories of inauthentic users, ranked by my perceived prevalence:

Account farmers: these can be people in third-world countries, automated or not. They can be using hundreds of mobile phones to create accounts and perform daily activity to make the accounts look legitimate. While they're building an activity history, they are also being paid to like/follow/interact with content.

Advertisers: these are bought accounts used to post inauthentic reviews of a service, inject it into discussion, and do PR.

Sloppers: people who build AI pipelines and then just pump the most dogshit content directly into a platform, trying to make any amount of money.

Nation-state propaganda arms: these accounts build a narrative character, then join discussions pushing a certain narrative, boost real content creators who share their message, and bog down discussion.

reply
People like the above poster who are "just running an experiment" or "trying something for fun", and who then wonder why online communities are full of AI now.
reply
In the case of Reddit and HN, a lot of it is done by businesses either blatantly advertising themselves or building up the karma they need to do so effectively. I recall reading obviously AI-generated replies to news articles, written by accounts associated with businesses related to the events in the news. This isn't new in the LLM era: hobby subreddits are well known to be full of businesses selling hobby gear and items doing self-promotion. It's just that now it is a lot more obvious because of the AI text smell.

That, and probably political astroturfing. Before every election my local subreddit sees a surge of crime stories. Go figure.

reply
I think some of it is account farming, but some is just people buying wholesale into the idea that if you're not using AI for everything, you're going to be left behind. On the Kagi Small Web list, there are plenty of hobby blogs that used to be normal pre-2023 and are now obviously LLM-written and AI-illustrated. There are also plenty of people on LinkedIn who post AI slop because they think it helps them build a "professional brand". I even have some distant friends who are using AI to respond to friend and family posts on Facebook, just because it makes you seem... smart? engaged? I don't know.

It's actively encouraged by some of the platforms too. In Gmail and Google Docs, you have incessant AI prompts along the lines of "help me write this". I think LinkedIn does the same.

reply
HN has historically been gamed for visibility. The stakes for doing this can be quite high if you can pull it off.
reply
Lots of marketing. Not even AI businesses, just regular consumer crap. They realized that blatantly spamming their product looks bad, so they orchestrate multiple accounts to look more organic. And people actually engage with it.
reply
My impression is that they're sometimes unemployed people or students hoping to create a popular open source project, and use it to find a job.

They aren't going to care about any of the advice in the article about not posting slop -- finding a job is (of course?) more important to them.

I can't really say they're doing anything wrong; maybe I would have done the same? ... It's just that at large scale, it doesn't work.

reply
There are many reasons for influence campaigns; that isn't new. Influencing the public is incredibly valuable; that's why so many invest so much in it. LLMs automate it like never before.

Plain advertising, government propaganda, political propaganda for one group or another to shift public opinion (it's done on TV networks, so why wouldn't they run online campaigns?), astroturfing by corporations promoting acceptance or fighting negative news (e.g. rideshare, AI, whatever certain wealthy personalities are doing)... the list goes on.

HN has always been relatively influential in the tech industry and therefore worth influencing, and now the cost is very low - you don't even need to hire many people, so less-resourced operators will find it worthwhile (and they will also attack lower-value forums).

reply
If you farm a fleet of good accounts, you control the discourse. On HN, you could boost whatever you're trying to push, and downvote or flagkill whoever objects.

There are obvious benefits to controlling public discourse, right? Even if it's just to support some project you're working on.

reply
There are certain topics that seem to get instantly flag-killed unusually often. IPv6 is one.
reply
I've seen a lot of IPv6 wars here without flagkilling happening.
reply
I've been more disturbed by comments that were flagkilled just for being wrongthink, not because they were rude or poorly argued. I've also seen a lot fewer of those flagkills over the last six months, which makes me feel like there were some fake accounts that got caught and culled.
reply
In the recent thread about life in a class war, there were a lot of comments in different places saying that if we don't fix this inequality problem, g-tines might come back. Every single one of them was flagkilled, no matter whether it was framed as "we have to get out the g-tines", "we have to fix this, otherwise psychopaths will get out the g-tines", or "thank god we've become civilized enough that we don't get the g-tines out".
reply