The “illegally trained LLMs” they’re talking about are trained on copyrighted data that they didn’t have permission to use; this isn’t about LLMs that have been trained to do illegal things. OpenAI (ChatGPT) is being sued because there is a lot of evidence that they used copyrighted content for training, like NY Times articles. OpenAI is so profitable that they’ll probably treat these lawsuits as a business expense and keep doing it. Most people won’t sue anyway…
I know that by “illegally trained LLMs” they are talking about training on copyrighted data. By “legally have access to,” I meant that they are legally allowed to train AI on it.
It’s ridiculous that companies can just ignore laws.
Wouldn’t that give people who use it for bad things easier access? It should be made illegal to create an LLM if they don’t legally have access to that data.
Oh, I’m not sure what you meant in your first comment then?