by Workaccount2 1 day ago
gardnr | 1 day ago | next
The 4-bit quant weighs 4.25 GB, and you need additional memory on top of that for the rest of the inference process. So yes, you can definitely run the model on a Pi; you may just have to wait a while for results.
https://huggingface.co/unsloth/gemma-3n-E4B-it-GGUF
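For reference, a minimal sketch of what running that quant looks like via llama-cpp-python; the file name, context size, and thread count here are illustrative assumptions, not values from the comment:

    # Minimal sketch: load the 4-bit GGUF and run one completion.
    # Assumes llama-cpp-python is installed and the quant from the
    # unsloth repo above has been downloaded locally.
    from llama_cpp import Llama

    llm = Llama(
        model_path="gemma-3n-E4B-it-Q4_K_M.gguf",  # ~4.25 GB 4-bit quant (assumed file name)
        n_ctx=2048,    # small context window keeps KV-cache memory down
        n_threads=4,   # e.g. the four cores on a Pi 5
    )

    out = llm("Explain quantization in one sentence.", max_tokens=64)
    print(out["choices"][0]["text"])

A small n_ctx matters on a Pi: on top of the 4.25 GB of weights, the KV cache and activations claim their own share of RAM, which is the "space for the rest of the inference process" mentioned above.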
refulgentis | 23 hours ago | prev
See the thread below; long story short, this is another in a series of blog posts that would lead you to believe this is viable, but it isn't. :/
https://news.ycombinator.com/item?id=44389793