Hacker News

devmor · yesterday at 2:56 PM · 3 replies

This is a real danger that I think a lot of people will run into as prices continue to rise.

Setting aside the productivity debate entirely, offloading cognitive tasks to LLMs leaves you less practiced at them and less ready to do them when the LLM isn't available. When you can only afford to delegate certain tasks to the LLM, you may find yourself very frustrated.


Replies

johntash · yesterday at 8:02 PM

I'm really hoping locally hosted LLMs get to the point of competing with current-day frontier models, so that we all have "unlimited" usage.
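One reason the "unlimited local usage" hope is plausible: popular local servers such as Ollama and llama.cpp already expose OpenAI-compatible chat endpoints, so switching from a paid API to a local one is mostly a matter of changing the base URL. A minimal sketch (the endpoint URL and model name below are assumptions, not values from the thread):

```python
import json

# Assumption: a local server (e.g. Ollama on its default port 11434)
# exposing an OpenAI-compatible /v1/chat/completions route.
LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str, temperature: float = 0.2) -> dict:
    """Build an OpenAI-style chat payload that most local servers accept."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

# The same payload shape works against a hosted API or a local server;
# only LOCAL_ENDPOINT changes.
payload = build_chat_request("llama3.1:8b", "Summarize this function.")
print(json.dumps(payload, indent=2))
```

Actually sending the request (with `urllib` or `requests`) requires a model server to be running locally; the point here is just that the request format is interchangeable.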

3abiton · yesterday at 8:58 PM

This is the bet many of the big AI companies are making, and why they've been heavily subsidizing API calls. With the latest crackdowns by the US government, it seems Anthropic is starting to reduce those subsidies, given their edge in the game. I'm starting to consider local models more seriously, beyond just testing, but the RAM/GPU market is inflated these days.
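The hardware-cost concern can be made concrete with a rough back-of-envelope estimate: a model's weights take roughly (parameter count × bits per weight ÷ 8) bytes, plus some allowance for KV cache and activations. A sketch (the flat 2 GB overhead is an assumption; real overhead grows with context length):

```python
def approx_vram_gb(params_billion: float, bits_per_weight: int,
                   overhead_gb: float = 2.0) -> float:
    """Rough VRAM estimate: quantized weights plus a flat allowance
    for KV cache and activations (assumption: short context)."""
    # 1e9 params * (bits / 8) bytes ~= params_billion * bits / 8 in GB
    weight_gb = params_billion * bits_per_weight / 8
    return round(weight_gb + overhead_gb, 1)

print(approx_vram_gb(70, 4))  # 70B model at 4-bit: ~37.0 GB, multi-GPU territory
print(approx_vram_gb(8, 4))   # 8B model at 4-bit: ~6.0 GB, fits a consumer card
```

This is why GPU memory, not compute, tends to be the binding constraint for local hosting, and why quantization (4-bit instead of 16-bit) is what makes consumer cards viable at all.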

cyanydeez · yesterday at 11:16 PM

Seriously, who isn't planning a local-first strategy?