So they should provide an interface to LLMs, disabled by default, enabled when users want it, and that’s it imho.
That also gives me the choice of which LLM provider to use, rather than being locked into whatever LLM Apple decided to put in their OS.
I want to give Claude access to the stuff Apple Intelligence has access to, for example.
Wow. I had no idea that people would misinterpret what I was saying in this way. I was not meaning to imply it was an expectation of users or developers. I meant it as a statement of a growing industry trend: OS and browser vendors shipping, or preparing to ship, LLMs.
By now the statement could probably be amended from "expected to gain access to" to "shipping with".
I hope the team maintaining the project now makes such an update, since apparently it's confusing so many people!
In theory it's useful. If devs can rely on local models, it's more private and decentralized, they don't need to funnel money to AWS or Anthropic. There are low-stakes use cases that only make sense if they're local (available offline) and free.
But in practice I've seen zero adoption of Apple Foundation Models in native apps. I wonder if any Mac/iOS devs have anything to share on this.
As for Apple's foundation models, I think the issue is more that they're just not very intelligent or good; maybe WWDC will change that. But if you want to implement LLM functionality, you're better off either calling an API or shipping a better small on-device model.
I find many of them frustrating. I had an iPhone previously, and the LLM summaries of text messages are what finally drove me to drop iOS. I have a family member who is undergoing cancer treatment. I can't explain to you the frustration of seeing wrong text summaries when the LLM went wild hallucinating test results, while the actual text simply said they were taking a test. OS basics and communication should be trustable, not left to the hallucinations of a small, shitty model.
What are you trying to say?
So my question is: are browsers and operating systems really expected to gain access to language models? If so - by whom: the users or LLM vendors like Google?
GP is clearly asking “Are they?”
I hate having to “dodge” all the AI-enabled controls my phone (iOS) is sprouting - I don’t need that shit, but there’s also no alternative.
We've seen this sort of song and dance before, crypto jumps to mind. Remember when social media sites suddenly were all about those hexagonal avatars? Most of this stuff is really in that same vein.
(To be clear, users don't want this. By pretty much every recent user-feedback metric, AI pushes are tiring users out and reek of corporate desperation to sell shit. It's only a very specific subsection of Silicon Valley that wants to stuff AI into everything like this.)
A lot of these products feel driven by an “everything must become AI” FOMO movement rather than by actual thoughtful integration.
Operating Systems: Windows (built-in Copilot), macOS and iOS (Apple Intelligence)
So it's >90% desktop browser and OS, plus >30% mobile OS.
Yes, I think it's very safe to say "browsers and operating systems are increasingly expected to gain access to language models."
I think this API is probably fine, but only if the user already has a model downloaded and wants these features. Case in point: Chrome quietly downloads Gemini Nano with no opt-out except through group policy. Things like this, and Microsoft's recent admission that they've over-indexed on Copilot features in Windows, make it increasingly difficult to trust that users actually want more than a few killer AI features, most of which are just ChatGPT.
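For what it's worth, the opt-in behavior I'd want is easy to express against the Prompt API shape Chrome has been experimenting with (`availability()` / `create()` / `prompt()`; the exact names and availability states may still change). A minimal sketch, with the model object injected as a parameter so the helper is testable and degrades gracefully where the API doesn't exist; the stub at the bottom is purely hypothetical:

```javascript
// Only prompt a built-in model if one is already available on-device;
// never trigger a model download on the user's behalf.
async function promptIfAvailable(lm, text) {
  if (!lm) return null;                    // API not present in this browser
  const status = await lm.availability();  // assumed states: "available" | "downloadable" | "unavailable"
  if (status !== "available") return null; // "downloadable" would mean a large background fetch
  const session = await lm.create();
  try {
    return await session.prompt(text);
  } finally {
    session.destroy?.();                   // free on-device resources if the session supports it
  }
}

// Hypothetical stub standing in for the real global, for illustration only:
const fakeLM = {
  availability: async () => "available",
  create: async () => ({ prompt: async (t) => `echo: ${t}` }),
};

promptIfAvailable(fakeLM, "hi").then((r) => console.log(r)); // logs "echo: hi"
```

The point of the `status !== "available"` check is exactly the complaint above: a page (or browser feature) should treat "the model would need to be downloaded first" as "not available" unless the user has explicitly opted in.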
Anecdotally, non-technical friends and family members know about ChatGPT and increasingly Gemini, get frustrated by Copilot, and don’t know Apple Intelligence exists.
https://superuser.com/questions/1930445/can-i-delete-the-chr...
The only people who expect them to do so are big tech executives. The average user does not expect nor want Copilot shoved into every possible corner of Windows, and Microsoft themselves have acknowledged this.