> i presume they wont let you “manage all your AI spend in one place” for free.

Of course they will. In return they get to control who they're routing requests to. I wouldn't be surprised if this turns into the LLM equivalent of "paying for order flow".

reply
i got shivers thinking about a future of ai dynamic pricing and an automatic gateway choosing the cheapest provider available
reply
Openrouter already does this, unless I've misunderstood the premise.
reply
They can route between models, but you pay the standard rate for whichever model is selected (plus a 5% fee). Afaik all current model providers have fixed prices per token which don't vary depending on, say, demand or hardware availability.
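The pricing model described here (fixed list price per token, plus a flat gateway fee) makes "pick the cheapest provider" a trivial calculation. A minimal sketch, with entirely made-up provider names and prices and an assumed 5% gateway surcharge:

```python
# Sketch of "route to the cheapest provider" under fixed per-token list prices.
# All provider names, prices, and the fee are illustrative, not real rates.

GATEWAY_FEE = 0.05  # assumed 5% gateway surcharge on top of list price

# price in dollars per 1M tokens: (input, output) -- made up
PROVIDERS = {
    "provider_a": (3.00, 15.00),
    "provider_b": (2.50, 10.00),
    "provider_c": (5.00, 20.00),
}

def effective_cost(prices, input_tokens, output_tokens, fee=GATEWAY_FEE):
    """List-price cost of one request, plus the gateway's cut."""
    in_price, out_price = prices
    base = (input_tokens / 1e6) * in_price + (output_tokens / 1e6) * out_price
    return base * (1 + fee)

def cheapest(input_tokens, output_tokens):
    """Pick the provider with the lowest effective cost for this request."""
    return min(
        PROVIDERS,
        key=lambda p: effective_cost(PROVIDERS[p], input_tokens, output_tokens),
    )

print(cheapest(10_000, 2_000))  # provider_b under these made-up prices
```

Since prices are fixed and known up front, the choice never changes per request; dynamic (demand-based) pricing is exactly what would make this selection interesting.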
reply
shivers? as in it frightens you? i believe there is no way around tokens being priced like gasoline at the gas station - the price changes every hour. Any other system means you are either over- or underspending.
reply