Given how all of Big Tech (except Google obviously) is going all in on Claude Code, I wouldn't be surprised if Anthropic becomes profitable first.
reply
Anthropic doesn't have anything other than the Claude models.

But notice that no one mentions DeepSeek anymore; not a single mention, which tells me they are preparing to scare everyone again. That's why Dario continues to fear-monger about local models.

Sometimes you do not need hundreds of billions of dollars for inference when it can be done locally with efficient software, and Google proved that. But where is the money in that? So the flawed belief persists that you can scale by buying GPUs indefinitely, which is exactly what Nvidia needs you to do.

It's only a matter of time before local models reach Opus level. We are one, or at most two, years behind, and Anthropic knows it.

reply
> It's only a matter of time before local models reach Opus level. We are one, or at most two, years behind, and Anthropic knows it.

Can confirm. Kimi K2.5 is pretty intelligent, and most of the time there's no noticeable difference between Opus and Kimi.

reply
Local models just make no economic sense, since the GPU will sit idle 99% of the time.
reply
You already have a GPU (at least an iGPU, and an NPU on most newer platforms) as part of your computer, so you might as well get some use out of it with local inference. And running inference on a larger model with an undersized GPU will leave it idling far less than 99% of the time. It still makes sense for most casual users, who will only rarely need a genuine "Pro"-class answer from an AI; doing it locally is far less hassle than paying for a subscription or managing API spend.
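For a sense of how little hassle local inference can be: once a local server such as Ollama is running, a query is a single HTTP call. A minimal sketch below, assuming Ollama's default port 11434 and a locally pulled model (the model name `llama3.2` is an assumption about your setup):

```python
import json
import urllib.request


def build_request(prompt: str, model: str = "llama3.2") -> dict:
    # Minimal payload for Ollama's /api/generate endpoint.
    # "stream": False asks for the whole answer in one JSON object
    # instead of a stream of partial chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local(prompt: str, host: str = "http://localhost:11434") -> str:
    # POST the payload to the local server and return the model's text.
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

No API keys, no billing dashboard; the GPU that would otherwise idle does the work.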
reply
False for a distributed team.
reply