
lbreakjai · yesterday at 10:42 PM

From whom? OpenAI and Google? Who else has the sort of resources to train and run SOTA models at scale?

You just reduced the supply of engineers from millions to just three. If you think it was expensive before ...


Replies

simonw · yesterday at 11:02 PM

> Who else has the sort of resources to train and run SOTA models at scale?

Google, OpenAI, Anthropic, Meta, Amazon, Reka AI, Alibaba (Qwen), 01 AI, Cohere, DeepSeek, Nvidia, Mistral, NexusFlow, Z.ai (GLM), xAI, Ai2, Princeton, Tencent, MiniMax, Moonshot (Kimi) and I've certainly missed some.

All of those organizations have trained what I'd class as a GPT-4+ level model.

teaearlgraycold · yesterday at 11:04 PM

A triopoly can still provide competitive pressure. The Chinese models aren't terrible either. Kimi K2.5 is pretty capable, though noticeably behind Claude Opus. But its existence still helps: a better product doesn't force you to buy it at any price when a good-enough alternative exists.
