They are also used by ML models that are deeply integrated in macOS and iOS without you knowing, such as object and text detection in images.
reply
And help in Photos, Final Cut Pro, and other apps.
reply
I wish they would hook it up to the iOS keyboard (or wouldn't, if they already have).
reply
If you strip away the branding, Apple has shipped, and continues to ship, a ton of algorithms that likely use the ANE, and developers can use CoreML to do the same.

Just some things that people will likely take for granted that IIRC Apple has said use the ANE, or at least would likely benefit from it: object recognition, subject extraction from images and video, content analysis, ARKit, spam detection, audio transcription.

reply
Don’t forget FaceID and many of the image manipulation features.

And while everyone else went to more powerful giant LLMs, Apple moved most of Siri from the cloud to your device. Though they do use both (which you can see when Siri corrects itself during transcription—you get the local Siri version corrected later by the cloud version).

reply
IIRC, FaceID was a thing before ML entered the picture.
reply
Apple's OSes run a lot of local ML models for many tasks that aren't branded as Apple Intelligence, and they have done so for many years now.
reply
This is a nice article. Thanks for sharing.
reply
You can convert your own ML models to MLX to use them; Apple Intelligence is not the only application.
reply
MLX does not run on NPUs AFAIK; just GPU and CPU. You have to use CoreML to officially run code on the Neural Engine.
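For what it's worth, Core ML lets you request the Neural Engine through `MLModelConfiguration.computeUnits`. A minimal sketch in Swift, where `MyModel` stands in for a hypothetical Xcode-generated model class:

```swift
import CoreML

// Sketch only: `MyModel` is a placeholder for whatever class Xcode
// generates from your .mlmodel file.
let config = MLModelConfiguration()

// Ask Core ML to prefer the ANE (falling back to CPU). Other options
// are .all, .cpuAndGPU, and .cpuOnly. Even with this set, Core ML
// decides at runtime where each operation actually executes.
config.computeUnits = .cpuAndNeuralEngine

do {
    let model = try MyModel(configuration: config)
    // ... run inference via model.prediction(...)
} catch {
    print("Failed to load model: \(error)")
}
```

Note that `.cpuAndNeuralEngine` was added in macOS 13 / iOS 16; on earlier releases `.all` is the closest you can get, and it may schedule work on the GPU instead.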
reply
Even then, there is no transparency into how it decides what runs on the ANE/GPU, etc.
reply
Correct. OS-level stuff gets first priority, so you can’t count on using it.
reply
Turns out third-party apps actually get priority for the ANE.
reply