I’m pretty sure they’d need a small data center to run a model the size of Opus.
Even an o3-quality model at that speed would be incredible for a great many tasks. Not everything needs to be Claude Code. Imagine Apple fine-tuning a mid-tier reasoning model on personal assistant/macOS/iOS sorts of tasks and burning a chip onto the Mac Studio motherboard. Could you run Claude Code on it? Probably not. Would it be 1000x better than Siri? Absolutely.
Yeah, I'm waiting for Apple to cut a die that can do excellent local AI.