Ok, but what if in the future I could guarantee that my generative model was not trained on the work I want to replicate? Say library X is the only library in town for some task, but it has a restrictive license. Can I use a model guaranteed not to have been trained on X to generate a new library Z that competes with X under a more permissive license? And what happens if someone looks and finds a lot of similarities?
I wish you luck proving it wasn't trained on the original library, or on any work that itself infringed.
This is ostensibly what Adobe is trying to do with their GenAI image model, Firefly.
https://en.wikipedia.org/wiki/Adobe_Firefly