Furthermore, for the example given, it would have made a lot of sense to me to generate those article summaries on the backend. Do it once and for all: no need to burden each client device (which is going to need to download the content anyway), no need to tie yourself to a specific provider (Apple in this case), and the experience is the same everywhere. Of course, the backend could use a model that is local to itself.

Not saying it’s _wrong_ either – maybe it doesn’t use a backend of its own (the client downloads content directly from some predefined set of sites), maybe there is functionality to adjust how the summaries work that benefits from doing it on device, etc. It just doesn’t convince me that “local AI should be the norm”.
