Hacker News

somewhereoutth · yesterday at 10:49 PM

Depends on whether, post-correction, it's worth anyone's money to keep training new frontier models. It could be that it isn't, so we'd be left with models that were trained during the bubble but are now increasingly out of date, or (open?) models that are somehow trained much more cheaply, with a consequent lack of utility.


Replies

cmiles8 · yesterday at 10:53 PM

Good point. Sooner or later there will be a reality check for the giant pile of burning cash that is new model training.