The new agent writing the code has probably seen at least parts of the original code in its training data. We can't speak of a clean-room implementation from an LLM, since LLMs are technically capable only of recombining their training data in different ways, not of any original creation.
That only holds for open source code, of course, since that is what ends up in training data.
The conclusion would be that you can never license AI-generated code, since you can't get a release from the original authors.
Of course, in practice it would work exactly the other way around: AI-generated code would be treated as immune even if it copied code verbatim.