Julia, 21, has received fake nude photos of herself generated by artificial intelligence. The phenomenon is exploding.

“I’d already heard about deepfakes and deepnudes (…) but I wasn’t really aware of it until it happened to me. It seemed like something anecdotal that happened in other people’s lives, but would never happen in mine”, thought Julia, a 21-year-old Belgian marketing student and semi-professional model.

At the end of September 2023, she received an email from an anonymous sender. Subject: “Realistic?” “We wonder which photo would resemble you best”, she reads.

Attached were five photos of her.

In the original photos, posted on her social media accounts, Julia poses clothed. Before her eyes are the same photos. Only this time, Julia is completely naked.

Julia has never posed naked. She never took these photos. The Belgian model realises that she has been the victim of a deepfake.

  • @[email protected] · 24 months ago

    Making it illegal and having a 3rd party decide if every post is allowed are two very different things

    Depends on the scale. If you’re a black man in the South in 1953, having a 3rd party decide whether you can do something means you can’t do that thing.

    I’m not speaking to this particular topic, just saying in general 3rd parties can be corrupted. It’s not a foolproof solution or even always a good idea.

    • @[email protected] · 4 months ago
      I agree. It’s a terrible idea for many reasons, and the fact that we can’t trust something like that to be run in good faith is near the top of the list.

      The comment I was responding to was saying this proposed law would strip our ability to speak our minds because it would create a new 3rd-party group that would validate each post before allowing it online.

      I was pointing out that making specific content illegal is not the same as having every post scrutinized before it goes live.