Hacker News

turnsout · yesterday at 5:21 PM · 2 replies

This looks amazing given the parameter sizes and capabilities (audio, visual, text). I like the idea of keeping simple tasks local. I’ll be curious to see if this can be run on an M1 machine…


Replies

Fergusonb · yesterday at 5:27 PM

Sure it can. The easiest way is to install Ollama, then run `ollama run gemma3n`. You can pair it with tools like simonw's LLM CLI to pipe stuff to it.
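A minimal sketch of that workflow, assuming Ollama is installed and the `llm-ollama` plugin is used to expose local Ollama models to the `llm` CLI (the exact model tag and prompt are illustrative):

```shell
# One-off prompt: Ollama pulls the model on first run, then answers locally
ollama run gemma3n "Summarize this in one sentence: local models keep simple tasks off the cloud."

# Piping files through simonw's llm CLI (assumes the llm-ollama plugin)
llm install llm-ollama
cat notes.txt | llm -m gemma3n "Summarize the key points"
```

The pipe pattern is the useful part: any command's stdout becomes the model's input, so local summarization slots into ordinary shell pipelines.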

bigyabai · yesterday at 5:25 PM

This should run fine on most hardware — CPU inference of the E2B model on my Pixel 8 Pro gives me a decode speed of ~9 tok/s.