Agree with all this, but am not sure how it applies to this case. This seems rather the opposite behavior: accelerated bad de facto behavior because de jure enforcement is infeasible.
We are seeing this in the world of digital media, where frivolous DMCA and YouTube takedown reports are used indiscriminately and with seemingly little consequence for the bad actor. Corporations preemptively comply with bad actors as a risk-reduction measure. The de jure avenues to push back on this are weak, slow, expensive, and/or infeasible.
So if you ask me what's the bigger threat right now, stricter or less strict enforcement, I'd argue that it's still generally the latter. Though in the specific case of copyright I'd like to see a bunch of the law junked, and temporal scope greatly reduced (sorry not sorry, Disney and various literary estates), because the de facto effects of it on the digital (and analog!) commons are so insidious.
I'd say it's neither; it's laws failing to keep pace with technological development. All the precedent around clean-room engineering implicitly assumes it'll be done painstakingly by a team of humans over months or years of work. This means that while there is a legal way around copyright, the effort it takes to reimplement something poses enough of a barrier that complying with the license is the easier option in most cases. If we treat AI the same way we treat humans here, that barrier is gone. Their blog post brings up the example of Phoenix Software's reimplementation of the IBM PC BIOS. It took a team of engineers four months to write the initial version of that work. The authors were able to produce their own clean-room PC BIOS, with zero human involvement, in less than an hour. Currently, both of these are treated as legally equivalent.