upvote
> The issue is not so much what they do or that they exist, but how they are utilized

This is exactly how we got here, though. Technology is not passive: it changes incentives, procedures, and ideas, and it shapes the world. If we don't structurally limit what it does and how it's used, then we are not in control, no matter what our personal choices are.

reply
A major problem is that if we structurally limit what technologies do, we are still not in control. Now whoever we empowered to control and limit the technology is in control. Who keeps them accountable?

You’ll probably get one of three outcomes: regulatory capture by monopolies, self-dealing by bureaucrats to enrich themselves or gain power, or regulatory capture by self-absorbed ideologues who halt all progress or force it down some ideologically approved path.

In none of those scenarios is anything aligned with the best interest of the people.

reply
That’s what you will get in the US. It’s not clear a functioning democracy would produce the same outcome.
reply
I don’t disagree. A consumer oriented democracy is not well equipped for the challenge.
reply
I hate to tell you this but nobody has ever been in control. To think you can is to think you can unring a bell.
reply
deleted
reply
Right, and that's why we all died in a nuclear war.....
reply
The disincentives to nuclear war are glaringly obvious enough that even politicians (and their masters) get it.

AI isn't like that. One problem is that it's still rather generally misunderstood. "AI" is not "intelligence". It's intelligence-adjacent; something like LLMs resembles part of our psyche, the subconscious facility that allows us to form sentences without really thinking about it.

At any rate, I have to agree with most of the points the blog author brings up.

reply
> I feel like our whole world more and more circulates around manipulation

Hate to break it to you, but it's always been this way, and manipulation was even easier in the past, when information was so much more expensive to distribute.

reply
I really didn’t experience the early internet that way.
reply
The Old Internet was a whalefall: information online was fairly trustworthy while being more convenient and more plentiful than in-person information.

The whale's been eaten now. The broader Internet is mostly not trustworthy, or convenient, and the information is not even very plentiful.

People are retreating, and will keep retreating, into high-trust zones: in-person networks, product recommendations from real friends, and closed group chats.

It's not the end of the world, but things have changed. We'll have to put more work into finding information than we're used to.

reply
> Time to organize around groups and collectives that we know we can trust

I’ve had the same thoughts, but if you look deeper, it all circles back to what we already had: (open, transparent) public institutions, society, and government by the people. The foundation wasn't the problem; the environment was.

Along the way, social media noise, engagement optimisation, and Kardashian-style "entertainment news" infecting real news created an attention economy where, no matter how scandalous you are, attention can be minted into dollars. That is what polluted our infosphere and led to the lack of trust.

Now, nobody trusts those public entities any more: sometimes due to state-actor or ad-tech disinformation, and sometimes for good reason, as when a poisoned public allowed these 1980s-telemarketer-style political weirdos and their cronies to take over public administration.

reply
[dead]
reply
> Right now, they are utilized to further the class divide between rich and poor.

Ironically, this was the main reason LLMs were introduced in the first place: not to benefit the poor, but to widen the gap between the rich and the poor.

reply
The majority of human history has been written by the ruling class of the day. Transparency only seems to follow in the wake of their inevitable fall, usually at great cost in retrospective research via the oft thankless unraveling of threads of truth from their more copious fictions. Much like the machines we construct in our likeness, we too seem to get stuck in endless regressive cycles.

Folks in the "now" have always had a tendency to cling to their fictions as if they were truth, for whatever reason: nationalist exceptionalism, racial superiority, religions rooted in "othering", and so on. Humans seem to have an innate desire to fool themselves and trust in things they should not. Perhaps it's simply a sort of existential coping mechanism for living in a cold, unforgiving reality. We seek the comfort of lies.

Organizing around groups of trust tends to lead to factionalism and conflict. Knowing and trusting are sadly very different things in our species.

reply
Local models, powerful consumer hardware, and an informed populace that doesn't hate STEM would help, but none of that is good for shareholder value, so you get expensive everything, everywhere, all at once instead. And if you dare question the mindset of hating on STEM while being addicted to its fruits, that just means you're another one of those maximally SV-aligned sociopaths, so why bother? Evolve and let the chips fall where they may, because I don't see any other options playing out in an idiocracy craving strong, confidently wrong leadership.
reply
Our society, pre-internet, built systems to manage trust. The conditions that allowed those systems to exist (the speed of transmission of data, the ratio of content generation to verification, the ability to shape consensus) have changed.

You are ringing the clarion call for community and cooperation, and it will not work. Not because people don’t want community or the better things, but because incentives make the world go round.

The choice between making some money at the cost of polluting the information commons is no choice at all. That degradation of the commons means no one can escape. No community you form, no group you build, dodges the fallout when someone decides to set fire to shared infrastructure.

We are moving into the dark forest era of the information economy. As models improve, inference costs drop, and capacity increases, the primary organism creating content online will be the bot.

Instead of building communities of people, build collectives based on rules of engagement. Participants, be they bots or humans, must follow prescribed rules of conflict and debate.

That way it doesn’t matter if you are talking to a machine or a person. All that matters is that the rules were followed.

reply
Self-inflating, nipple-shaped balloons that generate their own lift without any helium would be an incredible achievement, but that doesn't mean they'd be useful beyond being novel. Chatbots are ultimately just predictive text on steroids, and only complete fools would base their business, or an entire economy, around them.
reply
[dead]
reply