Hacker News
by rickdg 1 day ago
by simonw 1 day ago
So much better. Hard to quantify, but even the small Gemma 4 models have that feels-like-ChatGPT magic that Apple's models are lacking.
by snarkyturtle 1 day ago
For one, AFM had a 4096-token context window, while this can be configured with a 32k+ token context window.
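The practical difference between a 4096-token and a 32k-token window can be sketched roughly. This is not Apple's or Google's API, just an illustration: the ~4-characters-per-token heuristic and the reserved-output budget are assumptions.

```python
# Rough sketch (assumption: ~4 characters per token, a common heuristic;
# this is not the AFM or Gemma tokenizer) of why a 32k window fits
# prompts that a 4096-token window rejects.

def approx_tokens(text: str) -> int:
    """Very rough token estimate: about 4 characters per token."""
    return max(1, len(text) // 4)

def fits(text: str, context_window: int, reserved_for_output: int = 512) -> bool:
    """True if the prompt leaves room for a reply within the window."""
    return approx_tokens(text) + reserved_for_output <= context_window

prompt = "word " * 5000  # 25,000 characters, ~6,250 estimated tokens
print(fits(prompt, 4096))    # AFM-sized window: False
print(fits(prompt, 32768))   # 32k window: True
```

A real integration would use the model's own tokenizer rather than a character heuristic, since token counts vary by vocabulary.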