upvote
Tokens cheaper? I don't think that's the case... VC-funded tokens were priced low to build a user base, and token prices will go up as providers eventually switch from growth to profitability.
reply
I wish I could place a lot of money on the opposite side of this bet.

I don't think many realize how good the cheap, alternative models are becoming. I prefer SOTA models for key work, but for tasks that can tolerate slightly less quality, I can spend 10X as many tokens on an open model hosted by a non-VC-subsidized provider (who is selling at a profit).

The situation is only getting better as models improve and data centers get built out.

reply
What open source model and what non-subsidized provider specifically?
reply
GLM 4.7 Flash is $0.07/1M tokens in, $0.40/1M tokens out on AWS Bedrock us-east-1. That's less than 1/10 the price of Haiku 4.5.

Bedrock isn't even the cheapest, although I'm fairly sure they aren't being VC-subsidized.

There are definitely cheap tokens out there. The big gotcha is "for tasks that can tolerate slightly less quality"
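To make the gap concrete, here's a quick back-of-the-envelope cost comparison. The GLM numbers are the Bedrock prices quoted above; the Haiku 4.5 numbers are an assumption based on Anthropic's published list prices and may be out of date:

```python
# Per-million-token prices in USD.
glm_in, glm_out = 0.07, 0.40      # GLM 4.7 Flash on Bedrock (from the comment above)
haiku_in, haiku_out = 1.00, 5.00  # Haiku 4.5 (assumed list price; verify before relying on it)

# Hypothetical workload: 2M input tokens, 0.5M output tokens.
tokens_in, tokens_out = 2.0, 0.5  # in millions

glm_cost = glm_in * tokens_in + glm_out * tokens_out
haiku_cost = haiku_in * tokens_in + haiku_out * tokens_out
print(f"GLM: ${glm_cost:.2f}, Haiku: ${haiku_cost:.2f}, ratio: {glm_cost / haiku_cost:.2f}")
```

Under those assumed prices the workload comes out to roughly $0.34 vs $4.50, i.e. under a tenth of the cost.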

reply
Yes, but how cheap is it to run four at the same time? It's tough to run one good model locally, and running four at once, which I commonly do with Claude and Codex, just doesn't seem to be happening anytime soon.
reply
I'm referring to hosted models such as via OpenRouter or from the model providers' own services.

I think everyone claiming that inference is getting more expensive is unaware that there are more LLM providers than Google, Anthropic, and OpenAI.

reply
Fair - there are bets both ways, though I wouldn't consider it a certainty. The revenue drive behind this AI build-out is going to be real and manifold.
reply
It will take a few years for the scheduled data center construction to finish; together with software optimizations that may emerge in the meantime, that could bring a significant decrease in token prices.
reply
And the lethal trifecta, but I suppose that applies to all agents as of now anyhow. Every AI provider has major warnings about letting AI have access to PII in the browser.
reply
Nobody can block the actual LLM providers; they use spoofed requests to scan the web for content, sometimes even using residential proxies.
reply
Sure they can; proof of work seems to be effective. Anubis has become pretty popular.
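For anyone unfamiliar, the general idea behind tools like Anubis is hashcash-style proof of work: the client must find a nonce whose hash clears a difficulty target before the server serves the page. Verification is one hash; solving takes exponentially more work as difficulty rises, which hurts mass scrapers far more than individual visitors. A minimal sketch of the technique (not Anubis's actual implementation or parameters):

```python
import hashlib

def solve(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce so sha256(challenge + nonce) has
    `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server-side check: a single hash, regardless of difficulty."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

# 12 bits of difficulty takes ~4096 hashes on average; real deployments
# tune difficulty per client.
nonce = solve("example-challenge", 12)
assert verify("example-challenge", nonce, 12)
```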
reply
They don’t need to be 100% effective; they just need to make you afraid enough of being banned not to bother trying.
reply
How do they know that the "you" accessing the site is the same "you" they previously banned?

Face-scanning? Iris patterns?

reply
You used your credit card to buy whatever service or product they sell.
reply
I hate to break it to you, but it is really easy to get anonymous Visa/Mastercard cards.
reply
> the thing preventing mass-adoption seems to be the number of tokens it takes.

Try the exorbitant expense and ballooning waste of generated electricity and usable water.

reply