No. The AICore service runs its inference on Tensor internally (http://go/android-dev/ai/gemini-nano)
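For the curious, the app-facing side of that looks roughly like this. This is a hedged sketch against the experimental Google AI Edge SDK (com.google.ai.edge.aicore), so treat the exact class/builder names as my recollection of the docs rather than gospel, and note it only runs Google's own Gemini Nano, not your model:

```kotlin
import android.content.Context
import com.google.ai.edge.aicore.GenerativeModel
import com.google.ai.edge.aicore.generationConfig

// Sketch: on-device Gemini Nano via the experimental AICore SDK.
// Works only on supported devices (e.g. Pixel 8 Pro / Pixel 9);
// there is no path here for running an arbitrary 3P model on the TPU.
suspend fun summarizeOnDevice(appContext: Context, text: String): String? {
    val model = GenerativeModel(
        generationConfig {
            context = appContext        // AICore needs the app context
            temperature = 0.2f
            topK = 16
            maxOutputTokens = 256
        }
    )
    return try {
        model.generateContent("Summarize in one sentence: $text").text
    } finally {
        model.close()                   // release the AICore session
    }
}
```

So "it runs on Tensor" is true in the sense that Google's own service does; nothing in that API surface lets you bring your own model.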
> Is there anything at all from anyone I can download that'll run it on Tensor?
No.
> If there isn't, why not? (i.e. this isn't the first on device model release by any stretch, so I can't give benefit of the doubt at this point)
Mostly because 3P support has not been an engineering priority.
Got it: assuming you're at Google, then in eng parlance it's okay if it's not Prioritized™, but product/marketing/whoever shouldn't be publishing posts premised on it running 60 fps multimodal experiences on device.
They're very, very lucky that the ratio of people vaguely interested in this to people who actually follow through on using it is high, so comments like mine end up at -1.
https://ai.google.dev/edge/litert/android/npu/overview has been identical for a year+ now.
In practice, Qualcomm and MediaTek ship working NPU SDKs for third-party developers; NNAPI doesn't count and is deprecated anyway (see the sketch below).
(n.b. to readers: if you click through, the Google Pixel Tensor API is "coming soon". So why in the world has Google been selling Tensor chips in Pixels as some big AI play since... idk, at least 2019?)
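To make the "NNAPI doesn't count" point concrete: below is roughly the only vendor-neutral acceleration path a 3P app has had on Android, the LiteRT/TFLite NNAPI delegate. Sketch from memory, shapes are model-dependent; NNAPI is deprecated as of Android 15, and whether it ever reaches an NPU is up to the vendor driver. Qualcomm (QNN) and MediaTek (NeuroPilot) ship their own SDKs/delegates that actually hit their NPUs; Pixel has no equivalent.

```kotlin
import java.io.File
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate

// Sketch: LiteRT/TFLite inference through the (deprecated) NNAPI delegate.
// On Pixel this commonly falls back to CPU or GPU rather than the Tensor TPU,
// which is exactly the complaint above.
fun runModel(modelFile: File, input: FloatArray, outputSize: Int): FloatArray {
    val delegate = NnApiDelegate()
    val interpreter = Interpreter(modelFile, Interpreter.Options().addDelegate(delegate))
    val output = Array(1) { FloatArray(outputSize) }   // assumes [1, outputSize] output
    interpreter.run(arrayOf(input), output)            // assumes single [1, n] input
    interpreter.close()
    delegate.close()
    return output[0]
}
```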
On third-party model workloads, this is what you will get:
https://ai-benchmark.com/ranking.html
https://browser.geekbench.com/ai-benchmarks (NPU tab, sort w/ quantisation and/or half precision)
Google is clearly not serious about this on Pixels in practice, and the GPU performance also trails flagships by quite a lot, which really doesn't help. The CPUs trail by quite a lot too...