Hacker News

mgrund · yesterday at 7:46 AM · 0 replies

I really want to like local AI, but I highly doubt it will see wide adoption for a long time.

The additional up-front cost of hardware capable of running an LLM on top of normal workloads is unlikely to be accepted by most consumers.

The scale will be very constrained (like Apple's on-device models, which are small, heavily quantized, and have a small 4K-token context window). It's also terrible for battery life.
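To make the scale constraint concrete, here is a rough back-of-the-envelope sketch of weight memory at different quantization levels. The 3B parameter count is an illustrative assumption, not a figure from any specific Apple model; it ignores KV cache and runtime overhead.

```python
def weight_footprint_gb(num_params: float, bits_per_weight: float) -> float:
    """Approximate memory needed just to hold the model weights, in GB."""
    return num_params * bits_per_weight / 8 / 1e9

# Hypothetical ~3B parameter on-device model at various precisions.
params = 3e9
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit weights: {weight_footprint_gb(params, bits):.1f} GB")
```

Even aggressively quantized to 4 bits, such a model occupies on the order of 1.5 GB of memory that competes with everything else on the device, which is why on-device models stay small.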

AI as implemented today is simply computationally expensive, and unless you add dedicated hardware that exists only for this purpose (like the ANE), itself a large cost driver, I don't see it getting large-scale adoption.

Companies will probably need a server-backed solution as a fallback if they want a reasonable user experience, so why even invest in diverse hardware support?