Hacker News

danny_codes · yesterday at 7:00 PM

I doubt that’s the case. My guess is we’ll hit asymptotic returns from transformers, but price-to-train will fall at a Moore’s-law rate.

So over time older models will be less valuable, but new models will only be slightly better. Frontier players, therefore, are in a losing business. They need to charge high margins to recoup their high training costs. But latecomers can simply train for a fraction of the cost.
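The cost argument above can be sketched with a toy halving model. All the specifics here are illustrative assumptions, not numbers from the thread: a ~2-year halving period for training cost at fixed capability, and a hypothetical $100M frontier run.

```python
# Toy model (assumptions, not data): if the cost to train a model of a
# given capability halves every ~2 years, a latecomer pays only a
# fraction of what the frontier player paid for the same result.

def training_cost(initial_cost: float, years_later: float,
                  halving_years: float = 2.0) -> float:
    """Cost to reach the same capability `years_later` after the frontier run."""
    return initial_cost / 2 ** (years_later / halving_years)

frontier = 100e6  # hypothetical $100M frontier training run
print(training_cost(frontier, 0))  # 100000000.0 -- frontier player pays full price
print(training_cost(frontier, 4))  # 25000000.0  -- latecomer 4 years on pays 1/4
print(training_cost(frontier, 8))  # 6250000.0   -- 8 years on, ~6% of the original
```

Under these assumed parameters the frontier player's training spend depreciates fast, which is the "losing business" dynamic the comment describes.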

Since performance is asymptotic, eventually the first-mover advantage is entirely negligible and LLMs become a simple commodity.

The only moat I can see is data, but distillation proves that this is easy to subvert.

There will probably be a window, though, where insiders get very wealthy by offloading onto retail investors, who will be left holding the bag.


Replies

coldtea · yesterday at 9:15 PM

>I doubt that’s the case. My guess is we’ll hit asymptotic returns from transformers, but price-to-train will fall at a Moore’s-law rate.

There hasn't been a real Moore's law for a good while now, even before LLMs.

And memory isn't getting less expensive either...