Now we are facing unprecedented growth of AI as a whole. Do you think it is time for the FSF to draft a new version of the GPL that incorporates the new challenges of AI in software development, to keep protecting users' freedom?
Richard Stallman talked about this topic here: https://framatube.org/w/1DbsMfwygx7rTjdBR4DPXp
Can’t find timestamp tho.
GPLv3 already covers all of that. Programs that train AI have normal licensing applied. Programs that were modified by AI must be under the GPL too. The neural network itself is not a program, it's a format, and it's always modifiable anyway, as there is no source code. You can take any neural network and train it further without the data it was originally trained on.
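To illustrate that last point, here's a minimal sketch of continuing to train a published model with no access to whatever corpus it was originally trained on. It assumes PyTorch and Hugging Face's `transformers` library; the `gpt2` checkpoint and the sample sentences are just placeholders, not anything specific to this discussion.

```python
# Hypothetical sketch: fine-tune a published checkpoint on new text,
# without any access to the data it was originally trained on.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load someone else's published weights -- no original training data needed.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()

new_texts = ["Some new training sentence.", "Another one."]  # your own data
for text in new_texts:
    inputs = tokenizer(text, return_tensors="pt")
    # With labels == input_ids, the model computes the usual LM loss itself.
    outputs = model(**inputs, labels=inputs["input_ids"])
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```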
I keep saying “no” to this sort of thing, for a variety of reasons.
- “You can use this code for anything you want as long as you don’t work in a field that I don’t like” is pretty much the opposite of the spirit of the GPL.
- The enormous companies slurping up all content available on the Internet do not care about copyright. The GPL already forbids adapting and redistributing code without licensing the result under the GPL, and they're ignoring that. So another clause that says "hey, if you're training an AI, leave me out" is wasted text that nobody is going to read.
- Making “AI” an issue instead of “big corporate abuse” means that academics and hobbyists can’t legally train a language model on your code, even if they would otherwise comply with the license.
- The FSF has never cared about anything unless Stallman personally cared about it on his personal computer, and they’ve recently proven that he matters to them more than the community, so we probably shouldn’t ever expect a new GPL.
- The GPL already has so many problems (because it's based on one person's personal focuses) which the FSF either ignores or isolates in random silos (like the AGPL, as if the web were still a fringe thing) that AI barely seems relevant by comparison.
I mean, I get it. The language-model people are exhausting, and their disregard for copyright law is unpleasant. But asking an organization that doesn't care to add restrictions to a license that the companies don't read isn't going to solve the problem.
The problem with recent AI is about fair use of data, not about copyright. To solve the AI problem, we need laws that stop the abuse of data rather than laws that stop the copying of code.
Too soon. The GPL is a license that bends prevalent copyright law toward certain ideological goals. There is no prevalent copyright law regarding AI yet, so there is nothing to base such a license on.
First step: introduce AI into copyright law (and pray The Mouse doesn’t introduce it first).
I think if we want a GPLv4, it should not be made by the FSF.
But ironically, they are the "owner" of the license; nobody else can modify it.
The GPL is a license made by the FSF; I'm not sure who else could make a new version other than them. Other entities make their own licenses, which may or may not be compatible with the GPL.
Why is that, out of curiosity?
The FSF is a dysfunctional organization that refuses to let go of its horrible founder. I hoped it would move on; it didn't, and refused to despite massive community backlash. I no longer believe they should have any role in representing the Free Software movement.
I really like Stallman, the man who made me think about the importance of free software. In my opinion he is essential to the free software movement, even with some "controversial" ideas. I like the way he defends his ideas; that's something rare nowadays.
I mean, I think his ideas on free software are good generally but his behaviour and opinions on other topics are pretty fucking terrible. I don’t understand why people want to defend that part. The FSF can function without him and defend the ideas of Free Software.
Indeed, his ideas are often very controversial. He is an old man with old habits, and I think he has some deficiency in the way he communicates with people who disagree with him, which makes everything even worse. I don't know what will become of the FSF after him, for good and for bad.
Some of his ideas are very harmful and he is an abuser. I don’t know what to tell you.
It might be time to start thinking about it. However, it will depend on the legal system reaching a consensus on whether AI output needs to provide attribution.
There is already consensus, it just hasn’t been concluded explicitly yet.
There is no "AI" and there's no "learning", so there's no new, untrodden path in law, like some would have you believe. LLMs are data-processing software that take input data and output other data. In order to use the input data you have to conform to its licensing, and you can't hide behind arguments like "I don't know what the software is doing with the data" or "I can't identify the input data in the output data anymore".
LLM companies will eventually be found guilty of copyright infringement and they’ll settle and start observing licensing terms like everybody else. There are plenty of media companies with lots of money with a vested interest in copyright.
That's not how copyright works. It only cares about copying or replicating that data; the hint is in the name.