2. They aren't harvesting your data for government files or training purposes
3. They won't be altered overnight to push advertising or a political agenda
4. They won't have their pricing raised at will
5. They won't disappear as soon as their host wants you to switch
What are you doing with it?
Why do you want it?
None of them are as good as the big hosted models, but you might be surprised at how capable they are. I like running things locally when I can, and I also like not worrying about accidentally burning through tokens.
I think the future is multiple locally run models that call out to hosted models when necessary. I can imagine every device coming with a base model and using LoRAs to learn about the user's needs, with companies and maybe even households having their own shared models that do heavier lifting, while companies like OpenAI and Anthropic continue to host the most powerful and expensive options.
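The local-first, hosted-fallback routing described above can be sketched in a few lines. Everything here is hypothetical: `run_local` and `run_hosted` are stand-ins, not real APIs, and "too heavy for the local model" is faked with a simple length check.

```python
def run_local(prompt: str):
    # Stand-in for a local model. Pretend it handles short prompts
    # and returns None when the task is too heavy for it.
    if len(prompt) <= 40:
        return f"local: {prompt}"
    return None

def run_hosted(prompt: str) -> str:
    # Stand-in for a call out to a hosted API (OpenAI, Anthropic, etc.).
    return f"hosted: {prompt}"

def answer(prompt: str) -> str:
    # Try the cheap, private local model first; escalate only when needed.
    result = run_local(prompt)
    if result is None:
        result = run_hosted(prompt)
    return result
```

The design choice is that escalation is the exception, not the rule: most requests never leave the device, which is exactly the privacy and cost argument made above.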
I still don’t understand. What are you actually using this model you’re running locally to do?
What is the use case?