That's what the money pays for. The comment above mentions that "you might have to eventually pay an AI company a large amount of money to ask ChatGPT such a question."
Putting aside that it won't be a large amount of money for any particular query, that's how the AI companies see themselves: not as providers of information, but as providers of mechanisms that provide information. They aren't selling the information of others; they aren't selling information at all. They are selling the service of running the mechanism.
I'm always going to have a machine anyway—might as well max out the RAM when I purchase another.
(And so I too jumped on the Mac mini bandwagon a month or two back—64 GB. I'm enjoying pulling down the new models and putting them through their paces.)
It doesn't look like they have a way to filter down to "open" models. By this of course I mean "downloadable, local models".
I suppose if you know the "family" (Gemma, Qwen, etc.), you can just go to those models and test…
I've simply been pulling down what is popular from the LM Studio front end (and what runs on my hardware) and testing in situ.