A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
So this does bring up an interesting point that I haven’t thought about - is it the depiction that matters, or is it the actual potential for victims that matters?
Consider the Catholic schoolgirl trope - if someone of legal age is depicted as being much younger, should that be treated in the same way as this case? This case is arguing that the depiction is what matters, instead of who is actually harmed.
How I see it: creating fake child porn makes it harder for authorities to find the real ones.
That’s a good point. On the flip side, I remember there was a big deal about trying to flood the rhino horn market with fakes a few years ago. I can’t find anything on how that went, but I wonder if it could have that effect as well.
Also makes it harder for offenders to find the real ones!
Every country has different rules, according to Wikipedia.
Personally, I feel that if completely fictitious depictions of child porn, where no one is harmed (think AI-generated, or consenting adults depicting minors), were legal, it might actually prevent the real, harmful ones from being made, thus preventing harm.
At the same time, an argument could be made that increasing the availability of such a thing could land it in the eyes of a person who otherwise wouldn’t have seen it in the first place and problems could develop.
It could normalize something absurd and create more risks.
I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society.
I just know how (anecdotally) pornography desensitizes a person until more extreme things seem less bizarre and unnatural. I can’t help but imagine a teenager who would have otherwise developed a healthier sexuality stumbling on images like that and becoming desensitized.
It’s definitely something that needs some serious thought.
“I’m no expert and I’d rather leave it to people who thoroughly understand such behaviors to determine what is and isn’t ultimately more or less detrimental to the health of society.”
One of the big problems with addressing this issue is that NOBODY thoroughly understands these behaviors. They are so stigmatized that essentially nobody voluntarily admits to having pedophilic urges, so scientists can only study those who actually act on them and harm children. Those people are almost certainly not a representative sample of the entire population of pedophiles, and this severely limits our ability to study the psychology of that population as a whole and what differentiates the rapists among them from the non-rapists.
Yeah, valid points, but it’s not gonna be easy to tell, in practice. Doing a proper scientific test is likely going to be unethical for obvious reasons, so we’re left to wonder if the cons outweigh the pros or not.
Thanks for sharing that link. I hated reading through it, but it answered the question haha…
I don’t really have strong feelings about it but I do think I lean towards agreeing with you.
In America at least, people often confuse child pornography laws with obscenity laws, and end up missing the point. Obscenity laws are a violation of free speech, but that’s not what a CSAM ban is about. It’s about criminalizing the abuse of children as thoroughly as possible. Being in porn requires consent, and children can’t consent, so even the distribution or mere retention of this content violates a child’s rights.
Which is why the courts have thrown out lolicon bans on First Amendment grounds every time they’ve been attempted. Simulated CSAM lacks a child whose rights could be violated, and generally meets the definition of art, which would be protected expression no matter how offensive.
It’s a sensitive subject that most people don’t see nuance in. It’s hard to admit that pedophilia isn’t a criminal act by itself, but only when an actual child is made a victim, or a conspiracy to victimize children is uncovered.
With that said, we don’t have much of a description of the South Korean man’s offenses, and iirc South Korea has laws similar to the US’s on this matter. It is very possible that he was modifying real pictures of children with inpainting, or training models on pictures of a real child to generate fake porn of that child. That would make a real child the victim, so that’s my theory of what this guy was doing. Probably on a public image generator service that flagged his uploads.
The intent is to get off on fucking children; how you make that happen shouldn’t matter.
So would that include written stories?