
jherdman — yesterday at 5:39 PM

Is this sort of setup tenable on a consumer MBP or similar?


Replies

danw1979 — yesterday at 5:55 PM

Qwen’s 30B models run great on my MBP (M4, 48GB), but the issue I have is cooling — the fan exhaust blows straight onto the screen, and I can’t help thinking the thermal cycling will eventually degrade it. A Mac Studio makes far more sense for local inference for this reason alone.

pitched — yesterday at 5:44 PM

For a 30B model you want at least 20GB of VRAM, and a 24GB MBP can’t allocate quite that much of its unified memory to the GPU. So you’d want at least a 32GB MBP.
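A quick back-of-envelope check of where the ~20GB figure comes from. This is only a sketch: the bits-per-weight value and the overhead factor for KV cache and runtime buffers are illustrative assumptions, not numbers from the comment.

```python
# Rough memory estimate for running a quantized 30B-parameter model locally.
# Assumptions (mine, for illustration): ~4.5 bits per weight (a typical 4-bit
# quantization with some higher-precision tensors mixed in) and ~20% extra
# for KV cache and runtime buffers.

def model_memory_gb(params_billions: float,
                    bits_per_weight: float,
                    overhead: float = 0.20) -> float:
    """Approximate GB needed to hold the weights plus runtime overhead."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 30B model at ~4.5 bits/weight lands right around the 20GB mark:
print(f"~{model_memory_gb(30, 4.5):.1f} GB")
```

At full 16-bit precision the same model would need roughly 60GB of weights alone, which is why quantization is what makes 30B-class models fit on laptop-sized unified memory at all.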

Gareth321 — yesterday at 9:11 PM

The Mac mini (ideally the 64GB RAM configuration) is the most cost-effective option.