Yeah, that's a good callout for sure. The spending here is nuts, so I agree it's not "just another business that has to price itself right to be competitive".
I guess if the time horizon is long, like 20 years, then maybe the spending, as it begins to amortize, gets more in line?
I was thinking a comparison could be made to the cloud providers, each of which had to spend a lot of money building out data centers before making money. The difference there is that AWS proved the product first, so when Microsoft and Google came along, they knew it would work and be profitable. With AI, nobody has proven it will work and be profitable; they're all competing for that proof at the same time, which is a potentially dangerous mix for the reasons you cited.
The only way this even vaguely works, best I can tell, is on that decade-or-two timeline, but therein lies the problem: all the money getting pumped into data centers right now is going to produce data centers running GPUs that are old, inefficient, and slow by 5-years-from-now standards. And GPUs are by far the most expensive part of these data centers… having the buildings is barely an asset. We're investing all this money in right-now's technology, in one of the fastest-moving hardware segments, and for some inexplicable reason thinking that will lead to a sustainable advantage. What's to stop someone from waiting for the dust to settle, then, 5 years from now, spending way less money for more compute and just mopping the floor with everybody in this sector? And that's assuming (unreasonably, IMO) that local applications won't become good enough to take too large a bite out of their business before then.
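To put rough numbers on the "wait it out" play, here's a back-of-envelope sketch. Everything in it is my own made-up assumption: I'm treating perf-per-dollar as doubling every ~2 years (in the ballpark of recent GPU generations, but not a forecast), and the $100B / $30B spend figures are purely hypothetical:

```python
# Back-of-envelope: compute-per-dollar for a late entrant vs. today's buildout.
# ASSUMPTION: performance per dollar doubles roughly every 2 years
# (illustrative only, not a forecast of actual GPU trends).

DOUBLING_PERIOD_YEARS = 2.0

def perf_per_dollar(years_from_now: float) -> float:
    """Relative compute per dollar, normalized so today = 1.0."""
    return 2.0 ** (years_from_now / DOUBLING_PERIOD_YEARS)

today_spend = 100e9          # hypothetical $100B buildout today
late_entrant_spend = 30e9    # hypothetical $30B buildout 5 years from now

today_compute = today_spend * perf_per_dollar(0)
late_compute = late_entrant_spend * perf_per_dollar(5)

print(f"Today: ${today_spend / 1e9:.0f}B buys {today_compute / 1e9:.0f} units of compute")
print(f"In 5y: ${late_entrant_spend / 1e9:.0f}B buys {late_compute / 1e9:.0f} units of compute")
```

Under those made-up assumptions, the late entrant spends about a third as much and still ends up with roughly 1.7x the compute. The exact numbers don't matter; any plausible improvement curve points the same direction.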
And look at the difference in what that spending buys: the cloud providers were building out general-purpose-computing data centers that, even if the business had failed, had other potential use cases. What are they going to do with all those GPUs if AI doesn't pan out… start a massive, extremely expensive pre-rendered online gaming service? Only render Disney movies?
I dunno. None of this makes sense to me.