Why would someone gladly provide their work as open source but draw the line at AI reading it and using that knowledge to help more programmers later? It makes no sense to me. I actively want all of my code to be read by AI.
Doesn't seem inconsistent to me. I may want my code to be open source so that other humans can read it, understand it, build on it, and contribute to it.
I may also have a philosophical opposition to generative AI at the same time - there are plenty of environmental, societal, and intellectual-property costs that some may find unconscionable.
It's kind of breaking the social contract. Licences were drafted, conferences were held, and endless flamewars tried to codify what it means to collaboratively build, distribute, own and use open software.
Then came the model trainers, ignoring the entire discourse, reasoning: "if I can download it, it's mine to use." And then they basically sold the resulting tech back to the community.
Not unlike big tech extracting money from open source, but at least big tech usually complied with the license, even if sometimes maliciously.
Consider a couple of similar situations:
1. Many teachers don't publish, and those that do publish often still reserve their best for their students.
2. Open-source development sometimes operates like esoteric societies: you publish enough that people with the desire and insight become interested and engaged - both a filter and an invitation. That way you can shape the community you want.
Both depend on people really valuing these mutually-constitutive relationships.
My observation is that the generations raised on social media and gaming are happy enough with those substitutes, and view publishing their best work as a kind of self-promotion and participation in a larger, diffuse community (without a real role in governance). And they're right: expecting more personal communities now is a severely limiting factor, and AI removes most of the incentives to participate in someone else's project.
Open source is not necessarily about helping any programmer with any endeavor. In fact, my code targets end users, not fellow programmers.
I don't want my code to be used to build proprietary software. I want code built on top of mine to respect its users. I choose the AGPL for this reason.
I also don't mind the attribution.
The LLMs don't care about any of that, and they operate by hogging resources, creating a lot of waste and pollution, and disrupting society for unclear benefits. No thanks.
When AI respects my license, sure.
A couple of valid reasons:
+ they don’t want to pay the bandwidth costs
+ they don’t want to help train a model that might ultimately put them out of work.
I don’t personally agree that AI is taking our jobs, but I do think it’s a reasonable concern for others to have, so I would sympathise if that were the rationale.