upvote
It's not a very good small model to be honest.

That said, you might be surprised to learn that some of the models in the 3B-9B range could probably replace 80% of what non-vibe-coders use ChatGPT for.

It's a good idea to run small models locally if your computer can host them, for privacy and cost-saving reasons. But how can you trust Google to auto-install one on your machine in 2026? I just couldn't do it.

reply
Sure, local models are good, and yes, there's no way we can trust Google.

We can be positive the entire motivation behind Chrome is user-behavior surveillance. There's not a nano-chance in all the multiverses that Chrome's model is doing anything privately. They've gone to extraordinary lengths to accomplish this. It's not for free.

reply
It is entirely about user surveillance, as well as pushing their product onto users because they have the install base. Google Chrome has become Microsoft's IE6 in terms of hostile behavior toward users.
reply
You either die a hero or live long enough to see yourself become a villain.

What did we expect when they dropped "don't be evil" from their company values?

reply
A claim about as useful then as it is now. They never wanted to be anything but, once Sergey left. The Schmidt era had them publicly declare one thing while doing something else entirely behind the curtain.
reply
I don't trust them either, but the same Google makes Gemma 4 available to run as locally and privately as you want, and those models are pretty amazing for their size.
reply
Half of the reason to use local AI is to circumvent the censorship that Google, OpenAI, and so on impose. I don't want this Google crap on my computer.
reply
Which is why I uninstalled Chrome a (short...) while ago and my life went on unbothered.
reply
I ran a fairly large production test of this, and on _every_ measure except privacy it was worse than a free-tier server-hosted LLM.

Not happy about that, as I would like to see more local models, but that's the current state of things.

https://sendcheckit.com/blog/ai-powered-subject-line-alterna...

reply
> on _every_ measure except for privacy it was worse than a free tier server hosted LLM

Would you be able to compare this to other local models in its class and above that would fit on consumer-grade hardware?

reply
It's based on Gemma 3n, and it's not the best.

I find it works fine for simple classification, translation, and interpretation of images and audio. It can write longer prose, but it's pretty bad at it.

It can also constrain its output to a JSON schema or a regexp, for anything you might want to do with structured data.
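For the curious, here's a rough sketch of what that looks like with Chrome's Prompt API. The `LanguageModel` global and the `responseConstraint` option are taken from Chrome's built-in AI explainer docs; the exact surface may differ by Chrome version, and the schema and prompt here are made up for illustration:

```javascript
// Hypothetical sentiment schema -- the model's output is forced to match it.
const schema = {
  type: "object",
  properties: {
    sentiment: { type: "string", enum: ["positive", "negative", "neutral"] },
    confidence: { type: "number" }
  },
  required: ["sentiment"]
};

async function classify(text) {
  // LanguageModel only exists in Chrome with the built-in model available;
  // outside that environment, bail out with null.
  if (typeof LanguageModel === "undefined") return null;
  const session = await LanguageModel.create();
  const raw = await session.prompt(
    `Classify the sentiment of this text: ${text}`,
    { responseConstraint: schema } // JSON schema (a RegExp also works here)
  );
  return JSON.parse(raw);
}
```

Since the output is guaranteed to parse against the schema, you can feed it straight into whatever structured pipeline you have, no regex scraping of the model's prose required.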

reply
I wonder why they’re using Gemma 3 and not Gemma 4?
reply
Google has been trialling the Prompt API in Chrome for over a year, so since before Gemma 4 existed. But they have indicated they'll move to Gemma 4: https://groups.google.com/a/chromium.org/g/blink-dev/c/iR6R7...
reply
So that the big news on non-tech news sites will be the update itself, ensuring it's received in a positive light.
reply
It'll probably update to that without telling you at some point.
reply