> Won’t these H100s drop in price in a few years?
Doubtful. The increase in demand is greatly outpacing supply, and all signs point to continued acceleration.
> If I could drop $10,000 to have an effectively permanent opus 4.7 subscription today, I would.
lol well obviously, but realistically that price point is going to be closer to $100k, with a perpetual $1k a month in power costs.
Cool, thanks for the information. I guess they drive prices down by massively parallelizing requests across, say, an H100 x8 array, so the cost is spread over many users? So if I wanted to use it for 8 hours a day in my theoretical world, it’d be too expensive. My work definitely wouldn’t pay $100,000 for a server farm even if it gave an AI to all our employees; you’d need engineers, a colocation space, basically all the problems companies didn’t like and went to AWS to escape.
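To make the back-of-envelope comparison concrete, here's a rough sketch of the amortization math using the $100k hardware and $1k/month power figures from the thread. The 3-year depreciation window and the 50-way sharing factor are illustrative assumptions, not figures from the discussion:

```python
# Back-of-envelope: amortized cost of a dedicated 8x-GPU box vs. shared use.
# The $100k hardware and $1k/month power figures come from the thread above;
# the 3-year depreciation window and 50-way sharing are assumptions.

HARDWARE_COST = 100_000      # USD, one-time purchase
POWER_PER_MONTH = 1_000      # USD, ongoing
DEPRECIATION_MONTHS = 36     # assume the box is effectively obsolete in ~3 years

monthly_cost = HARDWARE_COST / DEPRECIATION_MONTHS + POWER_PER_MONTH
print(f"Dedicated box: ~${monthly_cost:,.0f}/month")

# A provider batching requests spreads that same fixed cost over many
# concurrent users, which is how per-token prices land far below the
# cost of dedicated hardware.
CONCURRENT_USERS = 50        # assumed effective sharing factor
per_user = monthly_cost / CONCURRENT_USERS
print(f"Shared across {CONCURRENT_USERS} users: ~${per_user:,.0f}/month each")
```

Even with generous assumptions, the dedicated box runs nearly $4k/month amortized, which is why batching requests across users is what makes hosted inference pricing work.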
Why? These models are going to keep drastically improving, and given all the new data centers, token prices will probably drop a lot in the future. It seems shortsighted given the absurd pace these things have been improving at.
taalas!!!
We tend to overestimate short-term change while underestimating long-term impact. A lot of hot air will likely vent when businesses realize LLMs didn't magically replace their workforce. Prices will also go through the roof when energy production inevitably fails to keep up with demand for compute. And Moore's law more or less predicts we'll have today's technology in our phones in less than a decade.
I predict the B200 data centers we're building today will be obsolete in 3 years, and we'll be using models and hardware that aren't even on a roadmap today. Likely not NVIDIA, likely not OpenAI or Anthropic. Maybe Chinese?
In the meantime, we must continue building software with the clumsy coding agents tied to cloud services, as this (for now) seems to be about the only area where AI makes economic sense.