A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”

  • Dr. Moose (77 points, 11 months ago)

    What a weird populist law tbh. There’s already an established legal framework that covers this: defamation. Not a lawyer but it seems like this should be addressed instead of writing up some new memes.

    They’ll use this as an opportunity to sneak in more government spyware/control is my guess.

    • @[email protected]
      link
      fedilink
      22
      edit-2
      11 months ago

      Not a lawyer but it seems like this should be addressed instead of writing up some new memes.

      Always interesting to see people who even admit that they don’t know, but still have a rather strong opinion.

      • Dr. Moose (25 points, 11 months ago)

        So only lawyers can have an opinion on law and be allowed public discourse? Lol

        • @[email protected]
          link
          fedilink
          1011 months ago

          Obviously not. Everyone is allowed to voice their opinion and has to accept that other people might find that opinion stupid and tell them so.

          My point is more that you seem, on the one hand, to realize that it’s a complex matter and that you lack the expert knowledge (I’m not a lawyer), but on the other hand still feel the need to express your opinion. There is nothing inherently wrong with that. It’s extremely common. Just something I have fun pointing out.

        • @[email protected]
          link
          fedilink
          3
          edit-2
          11 months ago

          Nobody’s saying you should be barred from participating; you just rightfully look like an idiot while you do it.

        • TigrisMorte (1 point, 11 months ago)

          When that opinion is about what a Law does or does not cover? Yes, only a Lawyer’s opinion should be involved. What a Law should or shouldn’t cover, or how it should or shouldn’t work? A layperson’s opinion is important.

          • Xhieron (16 points, 11 months ago)

            Hi, lawyer here.

            Everyone’s opinion about the law matters, including what it covers, whether it’s vague, whether it applies, etc. This is Lemmy–not court. We’re in the town square here. Drinking yourself through three years of law school doesn’t imbue you with magical abilities to interpret laws as though they were religious texts. It’s just an education–not a miracle. If lawyers always knew what the law meant and laypeople always didn’t, no one would be fretting over hotly anticipated SCOTUS opinions, because everyone would already know the outcome.

            But wouldn’t you know it, reasonable people sometimes disagree, and among those reasonable people, quite often, are non-lawyers.

            As it turns out, non-lawyers often have an outsized influence on the law. Did you know that Donald Trump has never been to law school? Unbelievable, right? But hard to fathom though it may be, the big orange idiot hasn’t sat in on a single hour of 1L Torts. In fact he may never even have seen the inside of a law library. Yet his opinion about the law has a tremendous impact, bigger even than Dr. Moose’s, because checking the “went to law school” box really doesn’t mean a hell of a lot outside of very limited situations.

            Personally, I’m much more interested in Dr. Moose’s opinion on this law than I am Rudy Giuliani’s, or even Clarence Thomas’s (and both those guys went to law school), and it’s no bother to me that he’s not a lawyer. In fact, it’s probably a mark in his favor.

            If you’re not interested in his opinion because he’s not a lawyer, well hey, that’s totally allowed, but you can easily ignore his comments without being pedantic. Or maybe you could just concede that there’s probably a bunch of strong opinions you also hold on subjects on which you’re not an expert. In fact, the whole lot of omg-not-a-lawyer! non-lawyers pitching little fits in this comment thread probably have strong feelings about war even though many of them have probably never put on a uniform. They might have strong feelings about healthcare despite never having darkened the door of a medical school. Shit, we might all even have strong feelings about politics despite never having gotten a single vote in a single election, ever. Can you believe it?!

            Yeah. It’s just an opinion. If you’re gatekeeping ‘having an unqualified opinion’ you should probably just lock yourself in your house and bar the windows, 'cause it’s gonna be an uphill battle for you.

            • TigrisMorte (2 points, 11 months ago)

              My dear self-declared Law-involved personage. Nope. It matters not one whit what the layperson’s opinion is about what a Law covers or not, as the sole arbiters are the Judiciary. Any layperson’s opinion involved is a matter of “should” or “shouldn’t”. They have no say in the final passed Law, only the Courts do. To claim otherwise is to pretend “sovereign citizen” is an actual thing.

              But the reality wasn’t important to you. Was it, law boi?

              • Xhieron (7 points, 11 months ago)

                Aww. You sound mad. Don’t be mad. Sorry if I got under your skin. Have a Coke and a smile.

                • TigrisMorte (1 point, 11 months ago)

                  A: High-sugar drinks are a leading cause of diabetes and should be avoided. I recommend rum instead.
                  B: Don’t make me angry; you won’t like me when I’m angry.
                  C: You are not the boss of me.

            • @[email protected]
              link
              fedilink
              111 months ago

              Everyone’s opinion about the law matters,

              Hard disagree; only the opinions of people who actually read the law matter on the topic. Everything else just creates more confusion. We are on the internet, where most people never bother to go and actually read what they are talking about, and that includes me.

      • sphericth0r (9 points, 11 months ago)

        I know, Congress should be ashamed of themselves. We would be hard pressed to find a group that had a worse understanding of technology.

    • @[email protected]
      link
      fedilink
      1711 months ago

      It’s not defamation. And the new law will likely fail to hold up to 1A scrutiny, if the description of it is accurate (it often is not, for multiple reasons that include these bills generally changing over time). This is more of a free speech issue than photoshopping someone’s head onto someone else’s nude body, because no real person’s head or body is involved, just an inhumanly good artist drawing a nude, and on top of that the law punishes possession, not just creation.

      An example question any judge is going to have for the prosecutor if this goes to trial is how the image the law bans is meaningfully different from writing a lurid description of what someone looks like naked without actually knowing. Can you imagine going to jail because you have in your pocket a note someone else wrote and handed you that describes Trump as having a small penis? Or a drawn image of Trump naked? Because that’s what’s being pitched here.

      • Dr. Moose (6 points, 11 months ago)

        It actually proposes “possession with the intention to distribute,” which just shows what a meme law this is. How do you determine the intention to distribute for an image?

        And I disagree with your take that this can’t be defamation. Quick googling says the general consensus is that this would fall into the defamation family of laws, which makes absolute sense since a deepfake is an intentional misrepresentation.

        • @[email protected]
          link
          fedilink
          211 months ago

          I guess if you have AI generate the senate house speaker fucking her in the ass in an alley full of trash while she holds money bags, it’s then political satire and protected?

    • @[email protected]
      link
      fedilink
      511 months ago

      Even better: Intentional infliction of emotional distress

      There are business interests behind this. There is a push to turn a likeness (and voice, etc.) into an intellectual property. This bill is not about protecting anyone from emotional distress or harm to their reputation. It is about requiring “consent”, which can obviously be acquired with money (and also commercial porn is an explicit exception). This bill would establish this new kind of IP in principle. It’s a baby step but still a step.

      You can see in this thread that proposing to expand this to all deepfakes gets a lot of upvotes. Indeed, there are bills out there that go all the way and would even make “piracy” of this IP a federal crime.

      Taylor Swift could be out there, making music or having fun, while also making money from “her consent”, i.e. by licensing her likeness. She could star in movies or make cameos by having her likeness deepfaked onto some nobody actor. She could license all sorts of YouTube channels. Or how about a webcam chat with Taylor? She could be an avatar for ChatGPT, or she could be deepfaked onto one of those Indian or Kenyan low-wage workers who do tech support now.

      We are not quite there yet, technologically, but we will obviously get there soonish. Fakes in the past were just some pervs who were making fan art of a sort. Now the smell of money is in the air.

      • Dr. Moose (4 points, 11 months ago)

        This seems like the most likely scenario tbh. I’m not sure whether personal likeness IP is a bad thing per se but one thing is sure - it’s not being done to “protect the kids”.

        • @[email protected]
          link
          fedilink
          311 months ago

          personal likeness IP is a bad thing

          It is. It means that famous people (or their heirs, or maybe just the rights-owner) can make even more money from their fame without having to do extra work. That should be opposed out of principle.

          The extra money for the licensing fees has to come from somewhere. The only place it can come from is working people.

          It would mean more inequality; more entrenchment of the current elite. I see no benefit to society.

          • Dr. Moose (1 point, 11 months ago)

            Not necessarily. I’m optimistic that this could lead to empowering status and personality as the main resources and push money out of society.

            • @[email protected]
              link
              fedilink
              111 months ago

              How so? Fame is already a monetizable resource. The main changes that I see are that 1) no opportunity to show their face and make their voice heard needs to be missed for lack of time, and 2) age no longer needs to be a problem.

    • @[email protected]
      link
      fedilink
      1
      edit-2
      11 months ago

      When you steal a person’s likeness for profit or defame them, then that’s a CIVIL matter.

      This bill will make AI sexualization a CRIMINAL matter.

      • Dr. Moose (1 point, 11 months ago)

        Where do you see that?

        The Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act would add a civil right of action for intimate “digital forgeries” depicting an identifiable person without their consent, letting victims collect financial damages from anyone who “knowingly produced or possessed” the image with the intent to spread it.

        • @[email protected]
          link
          fedilink
          1
          edit-2
          11 months ago

          Here:

          A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence.

          • Dr. Moose (1 point, 11 months ago)

            That doesn’t seem to be correct. More like a typo as criminalize =/= criminal law.