It's difficult because even if the underlying model is very good, not having a pre-built harness like Claude Code makes it very un-sticky for most devs. Even at equal quality, the friction (or at least the perceived friction) is higher than with the mainstream models.
reply
OpenCode? Pi?

If one finds it difficult to set up OpenCode to use whatever providers they want, I won't call them 'dev'.

The only real friction (if the model is actually as good as SOTA) is to convince your employer to pay for it. But again if it really provides the same value at a fraction of the cost, it'll eventually cease to be an issue.
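For what it's worth, pointing OpenCode at a different provider is mostly a matter of exporting the right API key and picking a model. A rough sketch (the variable names below are the common provider conventions, not something I've verified against OpenCode's docs, so treat this as an assumption):

```shell
# Hypothetical setup sketch -- exact variable and command names
# may differ; check OpenCode's own documentation.
export ANTHROPIC_API_KEY="sk-..."   # or OPENAI_API_KEY, etc., per provider

# Launch OpenCode and select the provider/model from its picker.
opencode
```

If that's a blocker for someone, the grandparent's point stands.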

reply

    "If one finds it difficult to set up OpenCode to use whatever providers they want, I won't call them 'dev'."

I feel the same way. But look at the ollama vs llama.cpp post on HN a few days back and you'll see that most of the enthusiasts in this space are very non-technical people.
reply
I think you mean ollama vs llama.cpp.
reply
I do!

Damn autocorrect :)

reply
I call it autocorrupt :)
reply
You can literally run it from Claude Code, easily too.
reply
They have instructions right on their page on how to use Claude Code with it.
reply