Treating an AI-assisted rewrite as a legal bypass for the GPL is wishful thinking. A defensible path is a documented clean-room reimplementation: one team studies the original (its behavior, or even its source) and writes functional specs and tests, while a separate team that has never seen the GPL code implements from those specs, using black-box characterization and differential testing, with the chain of custody documented throughout.
AI muddies the waters because large models trained on public repos can reproduce GPL snippets verbatim, so prompting with tests that mirror the original risks contamination, and a court could find substantial similarity. To reduce that risk: use black-box fuzzing and property-based tools, have humans review and scrub model outputs, run similarity scans against the original, and budget for legal review before calling anything MIT.
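The black-box fuzzing and differential-testing step can be sketched with nothing but the standard library. Everything here is illustrative: `original` is a hypothetical stand-in for the GPL program (in practice you'd shell out to the binary as an opaque subprocess), `reimplementation` is the clean-room code, and the whitespace-collapsing behavior is an invented example spec.

```python
import random
import string

# Hypothetical stand-in for the GPL tool's observed behavior: in a real
# clean-room effort this would invoke the original binary as a black box.
def original(s: str) -> str:
    return " ".join(s.split())

# The clean-room reimplementation, written only from the behavioral spec
# "collapse runs of spaces, drop leading/trailing spaces".
def reimplementation(s: str) -> str:
    out, prev_space = [], True  # prev_space=True drops leading spaces
    for ch in s:
        if ch == " ":
            if not prev_space:
                out.append(" ")
            prev_space = True
        else:
            out.append(ch)
            prev_space = False
    return "".join(out).rstrip(" ")

def fuzz(trials: int = 1000, seed: int = 0) -> None:
    rng = random.Random(seed)
    alphabet = string.ascii_letters + "   "  # bias inputs toward spaces
    for _ in range(trials):
        s = "".join(rng.choice(alphabet)
                    for _ in range(rng.randrange(0, 40)))
        # Differential check: both implementations must agree on every input.
        assert reimplementation(s) == original(s), repr(s)

fuzz()
```

The point of the exercise is that the new code is validated against observed behavior, not against the original's source, and the fuzzing log itself becomes part of the documented chain of custody.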
I'm somewhat confused about how it actually muddies the waters. Any person could have read the source code beforehand and then lied about it, or simply forgotten.
Our knowledge of what a person, or a model, actually retains of the original source is inherently incomplete, yet the entire premise seems to require certain knowledge that nothing remains.