If we want LLMs to continue to be offered, providers have to reach a point where they take in more money than they spend hosting them. And we still aren't there (or even close).
It is nobody's responsibility to ensure billion-dollar companies are profitable. Use them until local models are good enough.
The open models may not be as good, but maybe they're good enough. AI users can switch once prices rise, before things become sustainable for (some of) the large LLM providers.
I think this has to be solved with technological advances that make things cheaper, not by charging more.
I understand why they have to charge more, but not many people are going to be able to afford even $100 a month, and even that doesn't seem to be sufficient.
It has to come with some combination of better algorithms or better hardware.
If they started doing caching properly, and using proper sunrooms for it, they'd have a better chance.
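The caching point refers to prompt (prefix) caching: requests that share the same leading tokens, like a fixed system prompt, don't need to repeat the expensive prefill computation. A minimal sketch of the idea, with a placeholder string standing in for the cached KV state (the class and names here are illustrative, not any provider's actual API):

```python
import hashlib


class PrefixCache:
    """Toy sketch of prompt-prefix caching: reuse the result of
    processing a prompt prefix that has been seen before, instead
    of recomputing it for every request."""

    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def _key(self, prefix: str) -> str:
        # Hash the prefix text to get a stable cache key.
        return hashlib.sha256(prefix.encode("utf-8")).hexdigest()

    def get_or_compute(self, prefix: str) -> str:
        key = self._key(prefix)
        if key in self._store:
            self.hits += 1
        else:
            self.misses += 1
            # Stand-in for the expensive prefill computation
            # (in a real serving stack this would be the KV state).
            self._store[key] = f"kv-state-{key[:8]}"
        return self._store[key]


cache = PrefixCache()
system_prompt = "You are a helpful assistant."
for _ in range(3):
    # Same system prompt on every request: only the first one pays
    # the full cost, the other two are cache hits.
    cache.get_or_compute(system_prompt)
```

The economics angle is that cache hits cost far less to serve than fresh prefills, which is why providers already bill cached input tokens at a steep discount.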
I see the current situation as a plus. I get SOTA models at dumping prices. And once the public providers raise their prices, I'll be able to switch to local AI, because open models will have improved so much by then.
They are taking in more than they are spending on hosting. What isn't covered is the cost of training the next generation of models.