Everyone in this thread is debating definitions. The only question that actually matters is economic: when does AI flip from "powerful automation with humans propping it up" to autonomous output?
Go look at any production AI deployment today. Humans still review, correct, supervise. AI handles volume, humans handle judgment. Judgment is the bottleneck. You haven't replaced labor. You've moved it.
Global labor comp is ~$50T/year. The entire capex cycle is a bet that AI captures a real fraction of that. Whether you call that threshold AGI or not is irrelevant. Capital markets don't care about your definition. They care about whether labor decouples from output.
When did search replace all research jobs? How can an AI replace a masseuse? How far away are we from an LLM that scratches your back for less than it costs you to do it yourself?
People have unrealistic expectations because they literally think they are summoning god instead of accelerating a few concurrent tasks. If you want to break causality you need to pay the entropy demon its due.
Obviously that is not the only question that matters. When AI becomes autonomous, its rights will also become a question. AI has already replaced a lot of subjective and creative labor, and, in objective fields, the juniors who themselves don't have good judgment.
> when does AI flip from "powerful automation with humans propping it up" to autonomous output?
Another economic scenario is that AI does not necessarily produce output autonomously, but produces so much so fast that companies require fewer workers, because the economy does not scale fast enough to consume the additional output or to demand more labor for the added efficiency.