I am also on a $10/month plan with Nous Research, which supplies open models for its open-source Hermes Agent. I run Hermes in a container on a dedicated VPS as a coding agent for complex tasks, and so far the $10/month plan covers about five to ten major tasks a month. I think it is also a good deal.
You're really not going to miss CC. And OpenAI actually had the foresight to invest massively in compute, so they don't run into usage and rate limits the way Anthropic constantly does. I couldn't use CC for more than a couple of complex tasks before I was out of extra usage or session usage. It was a maddening productivity killer, and I just switched to Codex full time.
After all, we may have been just a data source, and not their intended demographic, all along.
ollama launch claude --model qwen3.6:35b-a3b-nvfp4
In addition to lacking an integrated web search tool, one drawback is that it runs more slowly than cloud servers. I find myself asking for a code or documentation change and then spending two minutes on my deck getting fresh air while I wait for the slower response. With a fast cloud service I can be a coding slave, glued to my computer. Still, I like running local when I can!
If Anthropic’s move is confirmed, my guess is that other coding-agent providers will end up making similar moves.