
KerrickStaley · today at 4:58 PM

I think (without having done extensive research) that some sort of Apple hardware is your best bet right now. Apple hasn’t raised RAM upgrade prices [1] (although, to be fair, their RAM upgrades were hugely inflated before the crunch), and their high memory bandwidth means they do inference faster than most consumer setups without a discrete GPU.

I have an M4 MacBook Air with 24 GB RAM and it doesn’t feel sufficient to run a substantial coding model (on top of all my desktop apps). I’m thinking about upgrading to an M5 MacBook Pro with much more RAM, but I suspect the capabilities of cloud-hosted models will always run ahead of local ones, so local inference may never be that useful to me. In the cloud you can run multiple models at once (e.g. on different problems), whereas locally you have a fixed memory-bandwidth budget, so running multiple model instances concurrently slows each one down.
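The bandwidth point can be made concrete with a rough back-of-envelope model: autoregressive decoding is typically memory-bandwidth-bound, so peak tokens/sec is roughly bandwidth divided by the bytes read per token (about the model's size in memory). The bandwidth and model-size figures below are illustrative assumptions, not measurements of any specific machine:

```python
def decode_tokens_per_sec(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Rough upper bound on decode speed for a bandwidth-bound model:
    every token requires streaming (approximately) all weights from memory."""
    return bandwidth_gb_s / model_size_gb

# Hypothetical 20 GB quantized model on two illustrative bandwidth tiers:
for label, bw in [("~100 GB/s machine", 100.0), ("~400 GB/s machine", 400.0)]:
    print(f"{label}: ~{decode_tokens_per_sec(bw, 20.0):.0f} tok/s ceiling")

# Two instances sharing the same bus each get roughly half the bandwidth,
# which is why parallel local inference doesn't buy you much:
print(f"two instances at ~400 GB/s: ~{decode_tokens_per_sec(400.0 / 2, 20.0):.0f} tok/s each")
```

This ignores compute, caching, and batching effects, but it captures why a high-bandwidth machine decodes faster and why splitting one machine's bandwidth across instances is roughly zero-sum.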

[1] https://9to5mac.com/2026/03/03/apple-macbook-price-increase-...