Each of these GPUs pulls up to a kilowatt of power. The average commercial power cost is 13.4¢/kWh. That means running a single H100 full tilt 24/7 comes to a power operating cost of roughly $1,100 per card per year.
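A quick sanity check on that figure (a minimal sketch; the 1 kW draw and the 13.4¢/kWh rate are taken from the comment above):

```python
# Annual power cost for one card running flat out, using the
# figures cited above (1 kW sustained draw, 13.4 cents/kWh).
POWER_KW = 1.0
RATE_USD_PER_KWH = 0.134
HOURS_PER_YEAR = 24 * 365

annual_cost = POWER_KW * HOURS_PER_YEAR * RATE_USD_PER_KWH
print(f"${annual_cost:,.0f} per card per year")  # -> $1,174 per card per year
```

That lands close to the $1,100 figure; the exact number depends on how often the card actually sits at full draw.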
In three years, the then-current generation of GPUs will be 50% or more faster. In six years, you're talking more than 100% faster, for the same energy cost.
If you're running a GPU data center on six-year-old GPUs, your cost to operate per sellable unit of work is double a competitor's, since their newer cards do twice the work for the same power draw.
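To make the cost-per-unit-of-work point concrete (illustrative throughput numbers; the 2x speedup at equal power is the scenario from the two comments above):

```python
# Energy cost per sellable unit of work: six-year-old card vs. a
# card that is 2x as fast at the same power draw.
ANNUAL_POWER_COST = 1_174.0      # $/year, from the estimate above

old_units = 1_000_000            # hypothetical annual throughput, old card
new_units = 2 * old_units        # 100% faster for the same energy

print(f"old: ${ANNUAL_POWER_COST / old_units:.5f} per unit")  # $0.00117
print(f"new: ${ANNUAL_POWER_COST / new_units:.5f} per unit")  # $0.00059, half
```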
Sure. But if that fully depreciated $1,100/year GPU produces $20k of economic benefit, would you decommission it as long as there is demand?
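The margin math behind that reply (the $20k/year benefit is the commenter's assumption, not a measured number):

```python
# Keep-or-decommission check for a fully depreciated card:
# annual revenue vs. annual power cost.
power_cost = 1_174     # $/year, from the estimate above
revenue = 20_000       # $/year of economic benefit, per the comment

print(f"margin: ${revenue - power_cost:,}/year")                 # -> $18,826/year
print(f"revenue covers power {revenue / power_cost:.0f}x over")  # -> 17x
```

Even at double the energy cost per unit of work, the card stays wildly profitable, which is the thrust of the reply.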
One thing I'm not sure about is whether there will be huge efficiency gains. Just looking at TDP, i.e. the power consumption, of say the 3090 and the 5090, the increase is substantial; compare that to performance and the performance lift stops looking that great...
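A perf-per-watt check along those lines (the 350 W and 575 W TDPs are NVIDIA's published board-power figures; the benchmark scores are placeholders, so substitute your own workload's numbers):

```python
# Perf-per-watt comparison the comment is gesturing at. TDPs are
# published board power; scores are hypothetical placeholders.
cards = {
    "RTX 3090": {"tdp_w": 350, "score": 100.0},  # baseline (hypothetical)
    "RTX 5090": {"tdp_w": 575, "score": 200.0},  # hypothetical 2x raw speed
}

for name, c in cards.items():
    print(f"{name}: {c['score'] / c['tdp_w']:.3f} score/W")

# With these placeholders the 5090 is 2x faster in absolute terms
# but only ~1.2x more efficient (0.348 vs 0.286 score/W), which is
# the gap the comment is pointing at.
```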