If this claim is true (inference is priced below cost), it makes little sense that there are dozens of small inference providers on OpenRouter. Where are they getting their investor money? Is the bubble that big?

Incidentally, the hardware they run on is known as well. The claim should be easy to check.

reply
To be clear, I'm talking about subscription pricing. API pricing for Anthropic is probably at cost.

I dare you to run CC on API pricing and see how much your usage actually costs.

(We did this internally at work; that's where my "few orders of magnitude" comment above comes from.)
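The comparison being described is simple arithmetic, and a back-of-the-envelope version is easy to run yourself. A minimal sketch follows; every number in it (token volumes, per-million-token rates, the $20 plan price) is an illustrative assumption, not actual Anthropic pricing or real usage data:

```python
# Hypothetical sketch: what a month of heavy agentic coding usage might
# cost at metered API rates vs. a flat subscription. All figures below
# are illustrative assumptions, not real prices or measured usage.

def monthly_api_cost(input_tokens, output_tokens,
                     usd_per_m_input, usd_per_m_output):
    """Cost in USD for a month of usage at per-million-token API rates."""
    return (input_tokens / 1e6) * usd_per_m_input + \
           (output_tokens / 1e6) * usd_per_m_output

# Assumed heavy usage: 500M input tokens (agents resend large context
# repeatedly) and 10M output tokens over a month.
api_cost = monthly_api_cost(500e6, 10e6, 3.00, 15.00)  # assumed $/Mtok rates
subscription = 20.00  # assumed flat monthly plan price

print(f"API-metered cost: ${api_cost:,.2f}")   # $1,650.00 under these assumptions
print(f"Subscription:     ${subscription:,.2f}")
print(f"Ratio: {api_cost / subscription:.1f}x")
```

Under these assumed numbers the metered cost lands roughly two orders of magnitude above the plan price, which is the shape of gap the comment is pointing at; your real ratio depends entirely on your own token counts.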

reply
It's an option and they are going to do it. Chinese models will be banned and the labs will happily go dollar for dollar in plan price increases. $20 plans won't go away, but usage limits and model access will drive people to $40-$60-$80 plans.

At cell phone plan adoption levels and cell phone plan costs, the labs are looking at a 5-10yr ROI.

reply