> If it's actually 'laundering' then it's invalid to begin with.
It's laundering in any reasonable meaning of the word. Whether it's legal according to the letter of the law is still being decided.
Please differentiate morality from legality, and the intent of the law from its letter.
> If anything AI code is cheap enough to even things out.
1) Do you think individuals have, and will continue to have, access to the same models that large corporations use internally, especially corporations that train LLMs themselves? There's nothing stopping Google from excluding its own source code from the publicly available models while including it in internal ones.
2) It's not just about the code, it's about the whole pipeline from nothing to a finished product and revenue stream. Did you know half the price of a new car is marketing? What matters is how much you can spend on ads, legal, market research, sales reps, etc. In some areas, especially B2B, nobody will even talk to you if you're a single guy in a shed; companies want stability, predictability, and long-term support.
3) More crudely, if you wanted to influence product selection or government elections: how many LLM tokens could you afford to sway online discussions, how many residential IPs could you afford, how much data could you buy about users to target each one specifically? Rich people will clearly have an advantage there.
Basically, if the cost of code goes towards zero, other factors will play a larger role.
> I think a lot of those people are consistent!
Only if they're consistently applying the rules to others but not themselves. Otherwise "permanent and total control over any idea they have" means they could never base anything on other people's ideas.
It's silly to say a human writing a piece of software is laundering their knowledge of existing software, even if they're trying to make a competitor to a specific thing. Legally and morally.
It's just as silly to say it's laundering when a machine does it.