Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It was something anecdotal that happened in other people’s lives; it wouldn’t happen in mine,” thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject line: “Realistic?” “We wonder which photo would best resemble you,” she reads.

Attached were five photos of her.

In the original content, posted on her social networks, Julia poses clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

  • @[email protected]
    23
    edit-2
    4 months ago

    We gotta ban photo editing software too. Shit, we gotta ban computers entirely. Shit, now we have to ban electricity.

    • @[email protected]
      23
      edit-2
      4 months ago

      I’m so tired of this “Don’t blame the tool” bs argument used to divert responsibility.

      Blame the fucking tool and restrict it.

      • @[email protected]
21
4 months ago

Why not blame the spread instead? You can’t ban the tool: it’s easily accessible software that only requires easily accessible consumer hardware, and you can even semi-easily train your own models using easily accessible porn on the Internet. So if you want to ban it outright, you’d need to ban the general-purpose tool, all porn, and the knowledge to train image generation models. If you mean banning the online apps that sell the service in the cloud, I can get behind that; it would raise the bar to create them a little, but that is far from a solution.

But we already have laws against revenge porn and Internet harassment. I think the better and more feasible approach, one that doesn’t have far-reaching free speech implications, would be to simply put heavy penalties on spreading nude images of people against their will, whether those images are real or fake. It’s harassment and revenge porn, and I don’t see how it’s different if it’s a realistic fake. If there is major punishment for spreading these images, I think that will discourage the spread for the vast majority of people.

            • @[email protected]
6
4 months ago

The companies that host and sell an online image-to-nude service, using a tuned version of that tool specifically designed to convert images into nudes, are definitely running a business model.

I agree it’s impractical, and opens dangerous free speech problems, to try to ban or regulate the general-purpose software. But I don’t have a problem with regulating for-profit online image generation services that have been advertising the ability to turn images into nudes, and have even been advertising their service on non-porn sites. Regulating those will at least raise the bar a bit and ensure there isn’t a for-profit motive where capitalism encourages it happening even more.

We already have revenge porn laws that outlaw the spread of real nudes against someone’s will; I don’t see why the spread of fakes shouldn’t be outlawed similarly.

              • @[email protected]
3
4 months ago

And I think if those companies can be identified as making the offending image, they should be held liable. IMO, you shouldn’t be able to use a photo without the permission of the person in it.

      • littleblue✨
7
4 months ago

        Blame the fucking tool and restrict it.

        I mean. It’s worked so well with you so far, why not?