Only if training new models leads to better models. If the newly trained models are just a bit cheaper but not better, most users won't switch. Then the entrenched labs can stop training so much and focus on profitable inference.
If they really have 40-60% gross margins, as training costs go down, the newly trained models could offer the same product at half the price.
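To make the arithmetic concrete (the dollar figures below are illustrative, not actual lab numbers): gross margin is (price − inference cost) / price, so a 50% margin means the incumbent's price is twice its inference cost. An entrant with comparable inference costs could therefore break even at roughly half the incumbent's price, or charge slightly more to keep a thin margin.

```python
def entrant_price(incumbent_price, incumbent_margin, entrant_margin=0.0):
    """Lowest sustainable price for a new entrant, given the incumbent's
    gross margin. Assumes both have the same per-unit inference cost.

    gross_margin = (price - cost) / price  =>  cost = price * (1 - margin)
    Illustrative sketch only.
    """
    inference_cost = incumbent_price * (1 - incumbent_margin)
    return inference_cost / (1 - entrant_margin)

# Incumbent charges $2.00 per unit at a 50% gross margin:
print(entrant_price(2.00, 0.50))        # 1.0 -> break-even at half the price
# Keeping a 10% margin, the entrant still undercuts substantially:
print(entrant_price(2.00, 0.50, 0.10))  # ~1.11
```

So "half the price" holds almost exactly at a 50% incumbent margin with a zero-margin entrant; at 40% margins the discount is smaller, at 60% larger.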