I'm very surprised at the quality of the new Gemma 4 models. On my 32 gig Mac mini I can be very productive with them. Not close to replacing paid AI, but if I had to tighten the belt I could make it work as someone who already knows how to program.
What's your setup/use case? Enhanced IntelliSense?
love hearing this. and think about it: if the 2B is already doing this well on your mac mini, imagine what the 4B, 26B, or 31B can do in 32 gigs. at lower-bit quantization you can fit pretty much any of them, and if you want full precision you still have solid options at the 2B and 4B level. you're sitting on way more capability than you're probably using right now. the coding block on just the 2B scored 8.44 and caught bugs most people would miss. glad you're getting real use out of it, thanks for reading.
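for anyone who wants to sanity-check the "it fits in 32 gigs" claim, here's a back-of-envelope sketch in python. the rule of thumb is parameter count times bytes per weight, plus some overhead for the KV cache and activations; the 1.2 overhead factor here is my guess, not a measured number:

```python
def est_gb(params_b: float, bits: int, overhead: float = 1.2) -> float:
    """Rough weight-memory estimate in GB: billions of params
    times bytes per weight, times a fudge factor for KV cache
    and activations (the 1.2 is an assumption, not measured)."""
    return params_b * (bits / 8) * overhead

# the model sizes mentioned upthread, at common quant levels
for params in (2, 4, 26, 31):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{est_gb(params, bits):.1f} GB")
```

by this rough math even the 31B at 4-bit lands around 18-19 GB, comfortably inside 32, while the 2B and 4B fit at full 16-bit precision. actual usage varies with context length and runtime, so treat these as ballpark numbers.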