Why not? Also serious.

It seems to just work every time I try it: the API is easy to work with, and the model library is convenient. I've never hit a snag that made me look elsewhere.
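Since the comment above praises the API, here's a minimal sketch of hitting Ollama's local REST endpoint from Python (11434 is Ollama's default port; the model name is just an example of one you'd have already pulled):

```python
import json
from urllib import request


def build_payload(prompt, model="llama3.2"):
    # One-shot (non-streaming) generation request body.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="llama3.2", host="http://localhost:11434"):
    # POST to /api/generate and return the model's text reply.
    data = json.dumps(build_payload(prompt, model)).encode()
    req = request.Request(
        f"{host}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `ollama serve` running, `generate("Why is the sky blue?")` returns the completion as a plain string.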

`ollama serve` and `ollama run`

The devex is great and familiar to anyone who has used Docker. Reading through the Lemonade documentation, it looks like a natural migration, but we're talking about two steps to get started versus just one. I'd need a reason to make that much of a change when I'm happy enough with Ollama.

Serious answer: I don't use it that much; it's just what I happened to download about a year and a half ago, and it works fine. I'm happy to see what may be a speed boost, and I have little interest in switching to something else (unless my situation changes, of course).

I like Ollama, mostly because the CLI is pretty nice. Its desktop app makes some odd choices, though: if a model supports tools, the UI should give me the "search" option, but it only shows up for cloud models.

I ran LM Studio for a while, but I don't really use local models much other than to mess about.

You can also run OpenWebUI locally, which should give you a nice, friendly UX once you set it up.