> almost nobody noticed

Rideshare costs are much higher than they were in years past. Everyone noticed.

> Oil is a finite resource that comes out of the ground

Yes, but the chips, copper cables, silicon, and all the other components that make up a server are finite too. Unless these magically appear from outer space, we'll face the same resource constraints as anything else that is pulled out of the ground.

These components are also far more fragile to source; see COVID and the collapse of global supply chains. The factories that make these components are also expensive to build and fragile to maintain; see the Dutch company that appears to be the sole supplier of certain manufacturing capabilities.[1]

> I would bet a lot of money that the price of LLM assistance will go down, not up, as the hardware and software advance.

My bet is that those advances will fuel the profits of AI companies rather than bring the price of AI down. Oversupply makes prices fall, but if supply is kept artificially low, prices stay high.

That's the comparison to OPEC and oil. There is plenty of oil to go around, yet supply is capped and prices are thereby kept high. There is no guarantee that savings in hardware or supply will be passed on by AI corps.

Indeed, there is no guarantee that there will be serious competition in the market. OPEC is a cartel, so why not an AI cartel? At the moment, all major players in AI are based in the same geopolitical sphere, making such an arrangement more likely, IMHO.

In the end, it's all speculation about what will happen. It just depends on which fairy tale one believes in.

[1]: https://en.wikipedia.org/wiki/ASML_Holding

While I fundamentally agree with the premise that compute gets cheaper by the year, I think a missed consideration here is that these models also require exponentially more compute to train with each iteration, in a way that has arguably outpaced the advances in compute.

Whether a generalized and broadly usable model can be trained within some multiple N of our current compute availability, allowing the price to come down with iterative compute advances, remains to be seen. Given the current race to the top among SOTA models, with increasingly small improvements over previous generations, I suspect the scaling demand for compute will outpace improvements in hardware architecture; and that assumes Moore's law even holds as we start to hit the limits of physics rather than engineering.

However, as it stands today, essentially none of these providers is profitable, so the real question is whether that disconnect resolves within their current runway, or whether they'll be forced to raise their price point to stay alive and/or raise more capital. It's pure conjecture either way.
