Hacker News

bandrami · today at 8:25 AM

If you want LLMs to continue to be offered we have to get to a point where the providers are taking in more money than they are spending hosting them. And we still aren't there (or even close).


Replies

hobom · today at 10:54 AM

They are taking in more than they are spending hosting them. However, the cost for training the next generation of models is not covered.

Larrikin · today at 1:52 PM

It is nobody's responsibility to ensure billion-dollar companies are profitable. Use them until local models are good enough.

quikoa · today at 9:39 AM

The open models may not be as great, but maybe they're good enough. AI users can switch once prices rise, before the business becomes sustainable for (some of) the large LLM providers.

carefree-bob · today at 8:29 AM

I think this has to be done with technological advances that make things cheaper, not by charging more.

I understand why they have to charge more, but not many people are gonna be able to afford even $100 a month, and even that doesn't seem to be sufficient.

It has to come from some combination of better algorithms and better hardware.

baruch · today at 11:28 AM

If they started doing caching properly, and using proper SRAM for it, they'd have a better chance.

lynx97 · today at 8:36 AM

I see the current situation as a plus: I get SOTA models at below-cost prices. And once the public providers raise their pricing, I'll be able to switch to local AI, because open models have improved so much.

nimchimpsky · today at 9:33 AM

[dead]