Also us weirdos with local-model use cases. But your point stands.
Unfortunately, like with the release of Qwen3.6-Plus, this model also isn’t released for local use. From the linked article: “Qwen3.6-Max-Preview is the hosted proprietary model available via Alibaba Cloud Model Studio”
The Max series was never available for local use, though. So this is expected.
Sure, not Plus or Max. I just use their smaller MoE models locally (which would never come close to the massive SOTA models) all the time.
Cost may or may not be a factor in my choice of model, but knowing the capabilities and knowing they will remain consistent, reliable, and available over time is always a dominant consideration. Lately, Anthropic in particular has not been great at that.
Anecdotally, the quality of output isn't significantly different; the speed seems to be what you're really paying for. Since the alternative is free, I'll stick to local.
What are the best models to run locally?
Right now, Gemma 4 and Qwen 3.6. I've found the latter to have a slight edge, but your results may vary.
Codex 5.4 is not out?
The Codex subscription is very generous at the pro tiers.