> Different users. Many people care about privacy and aren’t using Meta products.

Yeah but if they can rake in 100x as much by making products for people who don't care about privacy, then why spend time developing stuff for people who care?

There is still a small market left, of course, but that market will not have the billions of R&D behind it.

reply
It's largely out of Meta's hands now anyway. The risk here is not so much to privacy (it's Apple) but they'll walled garden the model space somehow for sure.
reply
> but they'll walled garden the model space somehow for sure.

People have said this since PyTorch was published, and it's no more true now than it was 10 years ago.

reply
70% of the world’s population use at least one Meta property at least once per day. How many of the other 30% are too poor/young/computer illiterate to be part of an addressable market?

Every company has dozens of SaaS products that store its business-critical information. Amazon installs Office on each computer, plus Slack (they were moving away from Chime when I left), and the sales department, SAs, and Professional Services use Salesforce (I'm a former employee).

Even companies that care about privacy are not a large addressable market. How long will it be before computers that can run GPT-4-level LLMs become cheap enough that companies will give one to every developer?

reply
The banking industry absolutely does care about privacy of their business data btw. We do use tools like Confluence but they're all hosted in our own data centers.
reply
And Capital One and Goldman Sachs are both hosted on AWS…
reply
These are all great statistics, but how do you explain the ClawdBot explosion? Even in lower-income countries like China. So much demand that Apple can't keep up production of Mac Minis. Why aren't these folks going toward cloud solutions? Is it cost, or is there some consideration for having more control over their data?
reply
ClawdBot doesn't generally run the model locally; it just talks to remote APIs. No different than any other agentic harness. You could run a local model on the same Mac Mini as your agent, but it wouldn't be very smart, and many agentic tasks around computer GUI/browser use, etc. would be out of reach.
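To make the "harness vs. model" split concrete: most agent harnesses just build OpenAI-style chat requests against a configurable base URL, so the same code can point at a cloud API or at a local server on the Mac Mini. A minimal sketch (all URLs and model names here are hypothetical, not ClawdBot's actual configuration):

```python
import json

# Hypothetical endpoints: a hosted API vs. a local OpenAI-compatible server
CLOUD_BASE = "https://api.example.com/v1"
LOCAL_BASE = "http://localhost:8080/v1"

def chat_request(base_url: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request.

    The harness is indifferent to whether base_url points at a cloud
    provider or at localhost; only the URL and model name change.
    """
    return {
        "url": f"{base_url}/chat/completions",
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Same harness code, two deployment choices:
cloud = chat_request(CLOUD_BASE, "big-cloud-model", "hello")
local = chat_request(LOCAL_BASE, "small-local-model", "hello")
```

The point of the comment above: in practice almost everyone leaves the base URL pointing at the cloud, because the local model that fits on the Mini is much weaker.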
reply
They are running cloud models in almost all cases. It's like saying it isn't cloud when you use the Facebook app on your phone (the app is ON your phone and running there).
reply
And people using ClawdBot are still not using local inference for the most part…

They aren't buying high-end $2000+ Mac Minis.

reply
> Why aren’t these folks going towards cloud solutions?

They are. The majority aren't doing inference on a Mac Mini, but instead using it as a local host for the agent while inference runs in the cloud. You could have the same general experience on a $200 Chromebook or $300 Windows box.

reply