Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It felt like something anecdotal that happened in other people’s lives but would never happen in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. The subject line: “Realistic?” “We wonder which photo resembles you best,” she reads.

Attached were five photos of her.

In the original photos, posted on her social media accounts, Julia poses clothed. Before her eyes now are the same images. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

  • @[email protected]
    link
    fedilink
    444 months ago

    This is going to be a serious issue in the future - either society changes and these things become accepted, or these kinds of generative AI models have to be banned. But even that still wouldn’t be a real safeguard against it…

    I also think we have to come up with digital watermarks that are easy to use…
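
    (Illustrative sketch, not something from the thread: the simplest kind of “easy to use” watermark is an invisible marker written into the pixels themselves. The snippet below, assuming Pillow and NumPy are available, hides a short text string in the least-significant bits of an image’s red channel and reads it back. Anything this simple is stripped by re-encoding or cropping; robust provenance watermarking, e.g. cryptographically signed metadata, is far more involved.)

    ```python
    # Minimal LSB (least-significant-bit) watermark sketch - illustrative only, not robust.
    import numpy as np
    from PIL import Image

    def embed_watermark(in_path: str, out_path: str, message: str) -> None:
        """Hide a length-prefixed UTF-8 message in the red channel's lowest bits."""
        img = np.array(Image.open(in_path).convert("RGB"))
        payload = message.encode("utf-8")
        data = len(payload).to_bytes(4, "big") + payload         # 4-byte length prefix
        bits = np.unpackbits(np.frombuffer(data, dtype=np.uint8))
        red = img[..., 0].flatten()                               # copy of the red channel
        if bits.size > red.size:
            raise ValueError("image too small for message")
        red[: bits.size] = (red[: bits.size] & 0xFE) | bits       # overwrite the lowest bit
        img[..., 0] = red.reshape(img.shape[:2])
        Image.fromarray(img).save(out_path, format="PNG")         # lossless, keeps the bits

    def read_watermark(path: str) -> str:
        """Recover the message hidden by embed_watermark."""
        red = np.array(Image.open(path).convert("RGB"))[..., 0].flatten()
        length = int.from_bytes(np.packbits(red[:32] & 1).tobytes(), "big")
        bits = red[32 : 32 + length * 8] & 1
        return np.packbits(bits).tobytes().decode("utf-8")

    # Usage (hypothetical file names):
    #   embed_watermark("original.png", "marked.png", "uploaded 2023-09, do not repost")
    #   read_watermark("marked.png")  ->  "uploaded 2023-09, do not repost"
    ```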

    • Justin · 31 points · 4 months ago

      Honestly, I see it as kinda freeing. Now people don’t have to worry about nudes leaking any more, since you can just say they’re fake. Somebody starts sending around deepfakes of me? OK, whatever, weirdo, it’s not real.

      • Dadd Volante · 35 points · 4 months ago

        I’m guessing it’s easier to feel that way if your name is Justin.

        If it was Justine, you might have issues.

        Weird how that works.

        • Justin · 21 points · 4 months ago

          Fair enough. Ideally it would be the same for women too, but we’re not there as a society yet.

          • @[email protected]
            link
            fedilink
            12
            edit-2
            4 months ago

            Such an empty response. Do you know that women have to take precautions on dates out of fear of being killed? They literally have a rational fear of being killed by their male dates, and it’s a commonly known and accepted fear that many women share.

            Society moving forward is a nice idea; women feeling safe is a much better one, and attitudes like yours are part of the reason women generally do not feel safe. Deepfakes are not freeing at all.

            • @[email protected]
              link
              fedilink
              104 months ago

              They literally have a rational fear of being killed by their male dates, and it’s a commonly known and accepted fear that many women share.

              No joke, stop dating shitty men.

              I get that’s too difficult for a lot of them, though.

              • @[email protected]
                link
                fedilink
                24 months ago

                How do you identify shitty men? They don’t wear labels. In fact they do their best to hide their shitty behavior at first.

                • @[email protected]
                  link
                  fedilink
                  64 months ago

                  Learn from your experiences, the experiences of others, and try to be a better judge of character. Don’t hang around bad crowds.

                  Unfortunately, most people can’t rise above peer pressure and think a group is always correct even if it’s comprised of shitty people.

                  What you don’t do is keep doing the same thing and expecting different results.

                  • @[email protected]
                    link
                    fedilink
                    14 months ago

                    Your responses are tone-deaf as fuck.

                    The experience of other women is that they thought they could trust certain men and ended up dead. My experience is that men can massively change their behavior depending on the social setting: just guys, guys and girls, or alone with a woman.

                    Besides, sometimes it’s not even a bad-crowd thing or a judge-of-character thing. Sometimes a creep gets obsessed with a woman, and if she turns him down the “wrong” way, the guy can flip and kill her.

                    There have been multiple incel killers. What did the women do wrong those times, oh great wise one?

                    I encourage you to make friends with women and actually listen to their experiences. It is not as simple as you make it out to be.

      • Aniki 🌱🌿 · 12 points · 4 months ago

        Poisoning the well is how we free ourselves from the vastness of the digital landscape that encompasses us. Make all data worthless.

    • @[email protected]
      link
      fedilink
      23
      edit-2
      4 months ago

      We gotta ban photo editing software too. Shit, we gotta ban computers entirely. Shit, now we have to ban electricity.

      • @[email protected]
        link
        fedilink
        23
        edit-2
        4 months ago

        I’m so tired of this “Don’t blame the tool” bs argument used to divert responsibility.

        Blame the fucking tool and restrict it.

        • @[email protected]
          link
          fedilink
          English
          214 months ago

          Why not blame the spread instead? You can’t really ban the tool: it’s easily accessible software that only requires consumer hardware, and you can even fairly easily train your own models using porn that’s freely available on the Internet. So to ban it outright, you’d need to ban the general-purpose tools, all porn, and the knowledge needed to train image-generation models. If you mean banning the online apps that sell the service in the cloud, I can get behind that; it would raise the bar to creating them a little, but that is far from a solution.

          Besides, we already have laws against revenge porn and Internet harassment. I think the better and more feasible approach, one without far-reaching free speech implications, would be to simply put heavy penalties on spreading nude images of people against their will, whether those images are real or fake. It’s harassment, just like revenge porn, and I don’t see how it’s any different when the fake is realistic. If there is serious punishment for spreading these images, I think that will discourage the vast majority of people from spreading them.

              • @[email protected]
                link
                fedilink
                English
                64 months ago

                The companies that host and sell an online image-to-nude service, using a tuned version of that tool specifically designed to convert photos into nudes, are definitely a business model.

                I agree it’s impractical, and opens dangerous free speech problems, to try to ban or regulate the general-purpose software. But I don’t have a problem with regulating for-profit online image generation services that have been advertising the ability to turn images into nudes, and have even been advertising that service on non-porn sites. Regulating those will at least raise the bar a bit and ensure there isn’t a for-profit motive where capitalism encourages it happening even more.

                We already have revenge porn laws that outlaw the spread of real nudes against someone’s will; I don’t see why the spread of fakes shouldn’t be outlawed similarly.

                • @[email protected]
                  link
                  fedilink
                  34 months ago

                  And I think if those companies can be identified as having made the offending image, they should be held liable. IMO, you shouldn’t be able to use a photo of someone without their permission.

        • littleblue✨ · 7 points · 4 months ago

          Blame the fucking tool and restrict it.

          I mean. It’s worked so well with you so far, why not?

    • @[email protected]
      link
      fedilink
      English
      214 months ago

      I think there’s a big difference between creating them and spreading them, and putting penalties on spreading nudes against someone’s will, real or fake, is a better third option. The free speech implications of banning software that’s capable of creating them are too broad and fuzzy, but putting harsh penalties on spreading them, on the grounds of harassment, would be clear-cut and effective. I don’t see a big difference between spreading revenge porn and spreading deepfakes, and we already have laws against spreading revenge porn.

    • @[email protected]
      link
      fedilink
      64 months ago

      With AI and digital art… what is real? What is a person? What is a cartoon, or a similar-but-not-identical likeness? In some cases, what even is nudity? How old is an AI image? How can anything then be legal or illegal?

      • @[email protected]
        cake
        link
        fedilink
        16
        edit-2
        4 months ago

        Where did it say anything about a Ministry of Truth deciding what can be posted online? Making it illegal and having a 3rd party decide if every post is allowed are two very different things.

        If it’s illegal, then there are ramifications for the platform, the user posting it, and the tool that created it.

        Content moderation is already a thing, so this is nothing new. It’s just one more item on the list to check when a post is reported.

        • @[email protected]
          link
          fedilink
          24 months ago

          Making it illegal and having a 3rd party decide if every post is allowed are two very different things

          Depends on the scale. If you’re a black man in the South in 1953, having a 3rd party decide whether you can do something means you can’t do that thing.

          I’m not speaking to this particular topic, just saying in general 3rd parties can be corrupted. It’s not a foolproof solution or even always a good idea.

          • @[email protected]
            cake
            link
            fedilink
            2
            edit-2
            4 months ago

            I agree. It’s a terrible idea for many reasons. The fact that we can’t trust something like that to run in good faith is among the top of those reasons.

            The comment I was responding to was saying this proposed law would strip our ability to speak our minds, because it would create a new 3rd party group that would validate each post before allowing it online.

            I was pointing out that making specific content illegal is not the same as having every post scrutinized before it goes live.

          • @[email protected]
            cake
            link
            fedilink
            54 months ago

            Well, you’re about 20 years too late. It has already started.

            See any of the Tor sites for examples of what is currently filtered out of the regular internet. It even gets your Google account permanently banned if you log in via the Tor browser.

      • @[email protected]
        link
        fedilink
        44 months ago

        Yeah, sorry - I disagree on every level with your take.

        I am also convinced that LLMs, at least, will soon destroy themselves, due to the simple fact of “garbage in, garbage out”.

    • @[email protected]
      link
      fedilink
      34 months ago

      It’s not a serious issue at all.

      Of course, if you’re the kind of greedy/lazy person who wants to make money off of pictures of their body, you’re going to have to find a real job.