The latest Kimi model is comparable in performance, at least for these sorts of use cases, but yes, it is harder to run locally.
> harder to use locally

Which means most people must be using OpenClaw connected to Claude or ChatGPT.
