Funny how people are suddenly on Elsevier's side. It's clear to me that AI training is transformative fair use under existing law. Maybe this will be the case to prove it.
I also find it funny. I said this regarding the other thread and article [0]:
'"They then copied those stolen fruits"
How are these fruits "stolen" if they still have what was allegedly stolen?
Dowling v. United States, 473 U.S. 207 (1985): The Supreme Court ruled that the unauthorized sale of phonorecords of copyrighted musical compositions does not constitute goods "stolen, converted or taken by fraud" under the National Stolen Property Act.
And even if, arguendo, sure, it's stolen. The purpose of copyright is "To promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries"
And you would be hard-pressed to prove that LLMs haven't advanced the arts and sciences, so it's at bare minimum transformative, i.e., fair use.'
>It's clear to me that AI training is transformative fair use under existing law.
I wouldn't even go that far. It's an entirely new product. It's like the guy who sold you the keyboard demanding royalties for the software you built.
That the person who wrote the book couldn't predict a new use case for it in training LLMs is irrelevant. The book isn't in the LLM. It's not being sold with the LLM. It's one of billions of tools used to create the LLM.
People try to sell this as the AI companies extracting value from the poor little IP holders like Disney. It's maddening. That content is your cultural heritage. It already belongs to you; some idiot has just been granted a lifetime of exclusive exploitation. An LLM is trained on data you already own. Disney et al. want to exploit the new technology to extract even more money out of stuff often created decades ago.
At absolute worst it's reverse engineering, which was supposed to be protected as fair use in the US, but apparently that's been somewhat eroded.
Illegally obtaining copyrighted materials is usually the issue, not the transformation part.
Absorb all "our" IP without consent, remove "our" own source of revenue in doing so, and then repackage it as their own product. Not really fair use, IMO.
It actually depends on the evilness of the company. Elsevier is just less evil than Zuckerberg and Meta, while publishers are even less problematic. I don't think there is anything funny in that.
Nor is there anything to defend about Meta. If they go out of business, humanity profits.
When you bundle together millions of copyrighted materials to produce a commercial product, I wouldn't call that fair use. Especially when the licensing of such material doesn't explicitly allow it, the material wasn't even purchased on consumer markets, and your commercial product may be a competitor/analogue to the copyrighted material.
Not even getting into all the GPL stuff, which in a better world would have screwed all the slop companies.
The enemy of my enemy, and all that.
Elsevier is shitty to people doing stuff that (imo) should be allowed. Meta is making money doing the same thing and not getting the same shittiness from Elsevier.
Elsevier at least works within the (admittedly broken) system, Meta does not.
> It's clear to me that AI training is transformative fair use under existing law. Maybe this will be the case to prove it.
That is not what this case is about. It is about the illegal piracy of copyrighted content by Meta for commercial use, which Zuck knew they were doing.
Why did Anthropic settle [0] with a multi-billion-dollar payout to authors after commercializing LLMs trained on copyrighted content that was illegally obtained and kept without the authors' permission?
There's a reason they (Anthropic) did not want it to go to trial: Anthropic knew they would lose, and the damages could have completely bankrupted them, running into the hundreds of billions.
AI boosters will do anything to justify the mass piracy and illegal obtainment of copyrighted material for commercial use (not research), which is not fair use in the US. There is no debate on this. [0]
[0] https://images.assettype.com/theleaflet/2025-09-27/mnuaifvw/...
If I could ask an LLM for a summary instead of buying a book, I'd go with the summary. That eats into commercial value, and the Supreme Court sided with Gerald Ford's publisher (Harper & Row v. Nation Enterprises) when a magazine published a small gist of his memoir because it ate into sales.
Such a garbage take. This is not a parody or a critique. Mark Zuckerberg is not Weird Al Yankovic.
It's not settled law, so I'm not sure how that's clear to you.
I think this completely misses the point... the point is that Meta pirated the media they used to train their model.
I am not a fan of US copyright law, but if I torrented millions of books, I would be facing a felony charge in criminal court and a multi-billion-dollar lawsuit in civil court (with statutory damages as high as $150,000 per title in cases of willful infringement).
In my opinion, this has nothing to do with whether or not AI training is transformative and thus fair use, and everything to do with whether or not the laws apply to everyone equally. If Facebook isn't forced to pay billions and elect a sacrificial executive to serve prison time, then I will remain angry.
I'm not on Elsevier's side, but I still think it's bullshit that giant companies are allowed to do things at a scale that I'd go to prison for.
"Funny" is how dishonest snipes are framed. It's such a common trope of internet quips that it's wearing me out. Can we please try to express our disagreements without the snideness?
I find it grating that so many AI boosters try to frame pushing back against the AI industry as a sudden about-face for everyone who spent the last 20 years pushing back against the copyright industry. I'm also in favor of decriminalizing or legalizing small amounts of pot for personal use. That doesn't mean I'm behind industrialized narcotic production on such a huge scale that it starts to distort the economy, with companies looking for new ways to add methamphetamine to every goddamn product.