The proverbial "$50B" is investment in next year's model. The current model cost under "$30B", and is therefore "profitable". It is a bet on scaling, yes, but that's been common throughout the industry (see, e.g., Amazon not being profitable for many years while building out infrastructure).
reply
Also see the Dario interview with Dwarkesh:

> If every year we predict exactly what the demand is going to be, we’ll be profitable every year. Because spending 50% of your compute on research, roughly, plus a gross margin that’s higher than 50% and correct demand prediction leads to profit. That’s the profitable business model that I think is kind of there, but obscured by these building ahead and prediction errors.

(a lot more at the link)

https://www.dwarkesh.com/p/dario-amodei-2?open=false#%C2%A70...
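The arithmetic in that quote can be sketched in a few lines. All numbers below are illustrative placeholders, not Anthropic's actual figures; the only inputs taken from the quote are "~50% of compute on research" and "gross margin higher than 50%":

```python
# Illustrative sketch of the profitability arithmetic in the quote.
# Numbers are hypothetical, not Anthropic's actual figures.

compute_budget = 100.0   # total compute spend, arbitrary units
research_share = 0.50    # ~50% of compute goes to research (per the quote)
gross_margin = 0.55      # gross margin on inference, anything > 50%

inference_cost = compute_budget * (1 - research_share)  # compute serving demand
revenue = inference_cost / (1 - gross_margin)           # revenue implied by that margin
gross_profit = revenue - inference_cost                 # = revenue * gross_margin
net = gross_profit - compute_budget * research_share    # after funding research

print(net > 0)
```

With the margin at exactly 50% the net is zero, which is why the quote hinges on the margin being *higher* than 50% plus correct demand prediction (so no idle pre-built capacity).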

reply
Except the rumor is that they subsidize even inference, not just that they have capex tied up in training.
reply
The maths shows inference is very profitable. Look at how Google/AWS/Azure charge the same rates as Anthropic does for running Claude models.
reply
You're missing the forest for the trees. Per-token pricing is irrelevant when you're just trying to get shit done. I pay 20 bucks a month for OpenAI, but I likely use $200+ a month of tokens on coding alone (and that's just the raw tokens, ignoring all the harnessing on their end). Even OpenAI has said they're losing money on the $200 subscriptions[1]. This is not a viable business model. Why do you think they are introducing ads this year[2]?

[1] https://fortune.com/2025/01/07/sam-altman-openai-chatgpt-pro...

[2] https://openai.com/index/testing-ads-in-chatgpt/
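The back-of-envelope math behind that claim, using the commenter's own numbers (the $200/month API-equivalent figure is their estimate, not OpenAI data):

```python
# Flat subscription vs. API-equivalent token usage.
# Figures come from the comment above; they are estimates, not OpenAI data.

subscription = 20.0   # $/month for the base plan
token_value = 200.0   # commenter's estimated API-equivalent usage, $/month

shortfall = token_value - subscription
print(f"provider eats ~${shortfall:.0f}/month for such a user")
```

The flat-rate plan only works if light users cross-subsidize heavy ones; the cited Altman comment suggests even the $200 tier fails that test.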

reply