Why shouldn't you? It's your device; it has hardware made specifically for inference.

What's to be gained, other than battery life, by offloading inference to someone else? What stands to be lost is your data ownership, and perhaps your money.

reply
> What's to be gained... by offloading inference to someone else?

Access to models that local hardware can't run. The kind of model an iPhone struggles to run is blown out of the water by most low-end hosted models. It's the same reason most devs opt for Claude Code, Cursor, Copilot, etc. instead of running local models for coding assistance.

reply
But apparently this model is sufficient for what the OP wants to do. Also, apparently it works on iPhone 15 and 17, but not on 16.
reply
Claude Code produces stuff orders of magnitude more complicated than classifying expenses. If the task can be run locally on hardware you own anyway, it should be.
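To make the simplicity point concrete, here is a minimal sketch of expense classification. The categories and keywords are hypothetical, not anything from the OP's app; the point is just that even a trivial rule-based pass handles much of this task, so a small on-device model is more than enough.

```python
# Hypothetical categories and keywords, for illustration only.
RULES = {
    "Groceries": ("grocery", "supermarket", "market"),
    "Transport": ("uber", "taxi", "fuel", "metro"),
    "Dining": ("restaurant", "cafe", "coffee"),
}

def classify_expense(description: str) -> str:
    """Return the first category whose keywords match the description."""
    text = description.lower()
    for category, keywords in RULES.items():
        if any(word in text for word in keywords):
            return category
    return "Other"
```

Compare that to what a coding assistant has to do, and the gap in required model capability is obvious.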
reply
I would really not want to upload my expense data to some random cloud server, nope. On-device is a real benefit even if it's not quite as comprehensive. And it's in line with Apple's privacy focus, so it's very imaginable that many of their customers agree.
reply