THE SENATE UNANIMOUSLY passed a bipartisan bill to provide recourse to victims of porn deepfakes, meaning sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress's upper chamber on Tuesday. The bill has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive such deepfake pornography, if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

  • Todd Bonzalez@lemm.ee

    All this law does is give victims the right to sue people who make involuntary porn of them, and clearly defines AI technology as something that you can sue over when it is used to simulate your likeness.

    The First Amendment doesn’t bar a civil matter like this. You’re free to say what you want without prior restraint, but you are still liable for the damage done when you intentionally lie about people — that’s what libel and slander law is for. Likewise, you have the right to make whatever kind of art you want, but if you make art depicting private citizens in a pornographic context without their consent, the person you are depicting now has the right to seek legal damages for your abuse.

    I am a firm believer in the concept that “Your rights end where mine begin”. You have the right to make art of me if you please, and I have the right to seek damages from you if your art slanders, defames, or sexualizes me in a pornographic way without my consent. Those are things that do real world damage, so I see no issue with victims of these things being given a voice.