If the cost of compute is going down, then eventually it will go down enough that we will run our LLMs locally and Anthropic will go out of business.
The industry will shift, yes. At some point, remote LLM compute will be like AWS: everyone could run bare metal at home, or VMs and containers, but many don't.
However, you'll still want the best model and toolset, so there is somewhere for them to pivot to. Something for them to sell or license.
It will be interesting to see where this all lands a decade from now. Who will be left?
> then eventually it will go down enough that we will run our LLMs locally and Anthropic will go out of business.
I want robust local LLMs as much as the next person: Gemma E2B (3.2GB) does my word completions as I type. It's gotten to the point where it knows what I'm going to type before I do!
But I don't see Anthropic going out of business anytime soon. As good as some of the open-source LLMs are, we're still a long way from being able to run frontier models at home.