If you run it on your own hardware, you can run it 24/7 without worrying about token prices or subscription limits, so you can often get more work done even on much slower hardware. Customizing an open-source harness can also be much more efficient than something like Claude Code.
For any serious application, you might be more limited by your ability to review the code than by hardware speed.
I have downloaded Kimi-K2.6 (the original release).
du -sh moonshotai/Kimi-K2.6
555G moonshotai/Kimi-K2.6
du -s moonshotai/Kimi-K2.6
581255612 moonshotai/Kimi-K2.6
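As a sanity check on the two numbers: `du -s` (GNU coreutils on Linux) reports size in 1 KiB blocks by default, so the raw block count should match the human-readable figure after conversion. A minimal sketch of the arithmetic:

```python
# du -s reports 1 KiB blocks by default (GNU coreutils).
# Convert the raw count above to GiB to cross-check du -sh's "555G".
def kib_to_gib(kib: int) -> float:
    return kib / (1024 ** 2)  # 1 GiB = 1024 * 1024 KiB

size_gib = kib_to_gib(581255612)
print(f"{size_gib:.1f} GiB")  # ~554.3 GiB, which du -h rounds up to 555G
```

`du -h` rounds up to the next displayed unit, which is why 554.3 GiB shows as 555G.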
For comparison, here are three larger and three smaller models, all recently released, sorted by decreasing size: du -sh zai-org/GLM-5.1
1.4T zai-org/GLM-5.1
du -sh XiaomiMiMo/MiMo-V2.5-Pro
963G XiaomiMiMo/MiMo-V2.5-Pro
du -sh deepseek-ai/DeepSeek-V4-Pro
806G deepseek-ai/DeepSeek-V4-Pro
du -sh XiaomiMiMo/MiMo-V2.5
295G XiaomiMiMo/MiMo-V2.5
du -sh MiniMaxAI/MiniMax-M2.7
215G MiniMaxAI/MiniMax-M2.7
du -sh deepseek-ai/DeepSeek-V4-Flash
149G deepseek-ai/DeepSeek-V4-Flash
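The comparison above can be produced in one pass instead of one `du` invocation per model. A sketch, assuming the model directories all sit under the current directory and GNU `sort` is available (`-h` compares human-readable sizes like 1.4T and 963G):

```shell
# One du call over all model directories, sorted largest first.
du -sh zai-org/GLM-5.1 XiaomiMiMo/MiMo-V2.5-Pro deepseek-ai/DeepSeek-V4-Pro \
   moonshotai/Kimi-K2.6 XiaomiMiMo/MiMo-V2.5 MiniMaxAI/MiniMax-M2.7 \
   deepseek-ai/DeepSeek-V4-Flash | sort -rh
```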