From whom? OpenAI and Google? Who else has the sort of resources to train and run SOTA models at scale?
You just reduced the supply from millions of engineers to just three vendors. If you think it was expensive before ...
A triopoly can still provide competitive pressure. The Chinese models aren’t terrible either. Kimi K2.5 is pretty capable, although noticeably behind Claude Opus. But its existence still helps: just because a better product exists doesn’t mean you have to buy it at any price.
> Who else has the sort of resources to train and run SOTA models at scale?
Google, OpenAI, Anthropic, Meta, Amazon, Reka AI, Alibaba (Qwen), 01.AI, Cohere, DeepSeek, Nvidia, Mistral, NexusFlow, Z.ai (GLM), xAI, Ai2, Princeton, Tencent, MiniMax, Moonshot (Kimi), and I've certainly missed some.
All of those organizations have trained what I'd class as a GPT-4+ level model.