> In the last year, token context window increased by about 100x and halved in cost at the same time.
So? It's nowhere close to solving the issue.
I'm not anti-LLM. I'm very senior at a company whose primary product has been AI-centric since before the GPT explosion. But to navigate what's going on now, we need to understand the technology's current strengths and weaknesses, as well as what they're likely to be in the near, medium, and long term.
The cost of LLMs dealing with their own generated multi-million-LOC systems is very unlikely to become tractable in the near future, and possibly not even in the medium term. Besides, no one has yet demonstrated an LLM-based system that even achieves that, i.e. resolving the technical debt it created.
Don't let fanboyism get in the way of rationality.
> The cost of LLMs dealing with their own generated multi-million LOC systems is very unlikely to become tractable in the near future
If you have a concrete way to pose this problem, you'll find that there will be concrete solutions.
There is no way to demonstrate something as vague as "resolving the technical debt that it created".