A.I. Is Making the Sexual Exploitation of Girls Even Worse
Parents, schools and our laws need to catch up to technology, fast.

  • VaultBoyNewVegas
    46 points · 9 months ago

    AI is definitely making things worse. When I was at school there was no tool for creating deepfakes of girls; now boys sneak a pic and use an app to undress them. That then gets shared, girls find out, and they obviously become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls’ clothes, neither of which was happening on anything like this scale before, as it would create a shit show.

    Also, most jurisdictions don’t actually have strict AI laws yet, which is making it harder for authorities to deal with this. If you genuinely believe that AI isn’t at fault here, then you’re ignorant of what’s happening around the world.

    https://www.theguardian.com/technology/2024/feb/29/clothoff-deepfake-ai-pornography-app-names-linked-revealed That’s an article about one company that provides an app for deepfakes. It’s a shell corp, so it’s not easy to shut it down or arrest anyone through the law, and hundreds of teenage girls have been affected by others creating non-consensual nudes of them.

    • @[email protected]
      42 points · 9 months ago

      When I was a kid I used to draw dirty pictures and beat off to them. AI image creation is a paintbrush.

      I very much disagree with using it to make convincing deepfakes of real people, but I struggle with laws restricting its use otherwise. Should images of ALL crimes be illegal, or just the ones people dislike? Murder? I’d call that the worst crime, but we sure do love images of murder.

      • @[email protected]
        13 points · 9 months ago

        You could do all of this before, that’s true, but you needed knowledge/time/effort, so the phenomenon was very limited. Now that it’s easy, the number of victims (if we can call them that) is huge. And that changes things. It’s always been wrong; now it’s also a problem.

        • BringMeTheDiscoKing
          10 points · 9 months ago

          This is right. To do it before, you had to be a bit smart and motivated. That’s a smaller cross-section of people. Now any nasty fuck with an app on their phone can bully and harass their classmates.

          • @[email protected]
            1 point · 9 months ago (edited)

            “The time/effort here is very similar; both methods have their own quirks that make them better or worse than the other, but both are very fast and very easy to do.”

            You’re lying to yourself and you must know that, or you’re just making false assumptions. But let’s go through this step by step.

            Now with a “nudify” app:

            • install a free app
            • snap a picture
            • click a button
            • you have a fake nude

            Before:

            • snap a picture
            • go to a PC
            • buy Photoshop for $30/month (sure) or search for a pirated version, download a crack, install it and pray that it works
            • find a picture that fits with the person you’ve photographed
            • read a guide online
            • try to do it
            • you have (maybe) a bad fake nude

            That’s my first point. Second:

            “the result should just be ignored as far as personal feelings go”

            Tell that to the girl who killed herself because everyone thought that her leaked “nudes” were actual nudes. People do not work the way you think they do.

            “You don’t need special laws to file for harassment or even possible blackmail. This whole thing is just overblown fake hysteria and media panic because ‘AI’ is such a hot topic at the moment.”

            True, you probably don’t need new laws. But the emergence of generative AI warrants a public discussion about its consequences. There IS a lot of hype around AI, but generative AI is here and is having/will have a tangible impact. You can be an AI skeptic but also recognise that some things are actually happening.

            “In a few years this will all go away again because no one really cares that much, and real leaked nudes will possibly even be declared deepfakes to confuse people.”

            For this to happen, things will have to get WAY worse before they get better. And that means people will suffer and possibly kill themselves, as has already happened. Are we ready to let that happen?

            Also, we’re talking only about fake nudes here, but once you consider that GenAI is going to spread throughout every aspect of our world, your point becomes even more absurd.

    • @xePBMg9
      17 points · 9 months ago (edited)

      Photoshop has existed for quite some time. Take a photo, google a naked body, paste the face on the body. The AI-powered bit just makes it slightly easier. I don’t want a future where my device is locked down and surveilled to the point that I can’t install what I want on it. Neither should the common man be excluded from taking advantage of these tools. This is a people problem. Maybe culture needs to change. Limit phone use in schools. Technical solutions will likely only bring worse problems. There are probably no lazy solutions here. This is not one of those problems you can just hand over to some company and tell them to figure it out.

      Though I could get behind making it illegal to upload and store someone’s likeness unless explicit consent was given. That is long overdue. Though some big companies would not get behind that, so it would be a hard sell. In fact, I would like all personal data to be illegal to store, trade and sell.

      • This is fine🔥🐶☕🔥
        14 points · 9 months ago

        “Photoshop has existed for quite some time. Take a photo, google a naked body, paste the face on the body. The AI-powered bit just makes it slightly easier.”

        Slightly easier? That’s one hell of an understatement. Have you ever used Stable Diffusion?

      • @[email protected]
        3 points · 9 months ago

        “Though I could get behind making it illegal to upload and store someone’s likeness unless explicit consent was given. That is long overdue. Though some big companies would not get behind that.”

        But many big companies would love it. Basically, it turns a likeness into intellectual property. Someone who pirates a movie would also be pirating likenesses. The copyright industry would love it; the internet industry, not so much.

        Licensing their likeness (giving consent after receiving money, if you prefer) would also be a new income stream for celebrities. They could license their likeness for any movie or show and get a pretty penny without having to even show up. They would just be deep-faked onto some skilled, low-paid double.

    • @[email protected]
      4 points · 9 months ago

      AI is a genie that can’t be put back into its bottle.

      Now that it exists, you can’t make it go away with laws. If you tried, at best all you’d do is push it to sketchy servers hosted outside the jurisdiction of whatever laws you passed.

      AI is making it easier than it was before to (ab)use someone by creating a nude (or worse) with their face. That is a genuine problem; it existed before AI, and it’s getting worse.

      I’m not saying you have to like it, but if you think laws will make that unavailable, you’re dreaming.