For certain types of "human in the loop", sure. If it can't write working code without a human in the loop, then it's not AGI. But a human-level coder also has lots of humans in the loop: a more senior developer doing code review, several layers of management, a product owner interfacing the project with outside reality, salespeople, etc.
Now I already hear you typing "but those roles should also be handled by AI if it's AGI", and I agree that an AI that can claim to be AGI should be able to handle those roles (as separate agents if necessary). But in a real setup it probably won't be the best choice to have it do those roles, for cultural and legal reasons. Or it might simply not be cost effective. Not to mention that under most definitions of AGI there can still be humans more capable than the AI, as long as the AI hits the 50th percentile mark or something like that. So even if it's an AGI with the ability to do these roles, we will still have humans in the loop for a long, long time.
> Now I already hear you typing "but those roles should also be handled by AI if it's AGI", and I agree that an AI that can claim to be AGI should be able to handle those roles (as separate agents if necessary). But in a real setup it probably won't be the best choice to have it do those roles, for cultural and legal reasons.
But today you can't do those roles with AI, meaning the AI isn't AGI. I agree we will probably have humans in the loop here and there even after we achieve AGI, for various reasons, but today you *need* to have humans in the loop; not having them isn't an option.