It's not an anomalous sense of cynicism; hundreds of thousands of people are looking at their options and feeling hopeless. I'm glad I am not in that camp. The only reason I'm not is that I was born sooner than they were. I don't blame them at all; it's looking a lot like the generation after them is cannon fodder if things continue to trend the way they are now.
I would tell them this is the problem to fix. Taking your anger out on AI is the most shortsighted thing. When faced with a powerful new capability, disavowing the capability instead of enabling society to leverage it is absurd.
AI is fundamentally the automation of labor, and we can all see the incredible fruits we reap from similar past leaps in capability.
Structure your society for a post-labor world. Don't halt the progress that has dramatically improved the human condition. To do so is a disservice to the species and all future humans - concretely, your own loved ones and especially your children.
You clearly accept this as Progress, but isn't the core debate here that it doesn't improve life for humans?
Does literally no one look at things from a historical perspective? The history of automation is right there on the Internet, for you to peruse at will.
UBI also won't fix things. The post-AI world that the US tech CEOs want us to imagine is not a utopia. The US manufactures almost nothing on the world scale. Our biggest contributions to the world economy were things like farm goods (which are in peril), fuel (which most countries are trying to phase out for environmental and, more recently, geopolitical reasons), and software (which will be commoditized through AI). Anything the US can manufacture, China can do better, cheaper, and faster. Manufacturing hasn't been part of our culture for decades, our infrastructure is shoddy, and it will only get shoddier once data centers spin up and more wealth is concentrated in people who pay no taxes.
Gen Z and those coming after have no chance at a sustainable life if the billionaires get what they are asking for. Besides, in a capitalist society, asking them to sacrifice their lives for the good of others is hilarious, especially when there is no foreseeable good to come after.
Of course no one sees it as a collective achievement when the announcements are aimed either at scaring people (even the team behind the model is supposedly worried about releasing it) or at pitching CEOs on replacing workers.
Artemis II, at least in the States, was an example of people genuinely feeling collective achievement. There is absolutely no reason this AI moment couldn't be that. Instead, the companies involved have explicitly chosen fear and capital as their marketing tools. We should be seeing this as an incredible time, but those involved do not want us to and plan to keep the spoils for themselves, so we shouldn't.
> But instead we're seeing them explicitly marketed as tools for capital centralization.
And labor automation, which is the single most valuable thing any technology can do. But if your answer is "kill the technology" instead of "structure society to live with it," of course you will experience pain.
"AI" is an achievement alright (so was designing a nuclear bomb), but if it is allowed to further gut the middle class, lower wages, and hence reduce spending (and tax receipts, to the extent that matters any more), then it will only hasten the spiraling of the US economy down the toilet.
It is a completely coherent position to like most technological progress, but at the same time be critical of some uses of ML/AI.
You are just making straw men here by suggesting that people who are critical of AI are critical of all technology.
Well, yes, but even if humans need to stay in the loop (as with most previous automations of labor), it is also moving the means of production into the hands of a small number of tech companies. In 2010 or 2020, anyone with a laptop could create a startup. It may be that in 2030 you can only do so if the major frontier model providers allow it and don't price access so high that it's only usable by entrenched players.
I am not fundamentally against AI, quite the contrary, but I think the models should be in the hands of the wider population (i.e. open-weight models), so that everyone has the means of production and can benefit from the automation. It would also only be fair, since the models are trained on the collective output of humanity. Of course, there are several barriers currently: there are pretty good open models, but running the near-frontier versions requires a lot of capital in the form of GPUs.