‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity

It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

    • @[email protected]
      16
      7 months ago

      Not all nudity is sexual, but there is no non-sexual reason to use AI to undress someone without consent.

      • @[email protected]
        7
        7 months ago

        The question of consent is something I’m trying to figure out. Do you need consent to alter an image that is available in a public space? What if it was you who took the picture of someone in public?

        • @[email protected]
          7
          7 months ago

          Keep in mind there is a difference between ethical and legal standards. Legally, you may not need consent to alter a photo of someone, except possibly when it is a copyrighted work. But ethically it definitely requires consent, especially in this context.

          • @[email protected]
            3
            7 months ago

            The difference between legal and ethical is that one could get you fined or imprisoned, and the other would make a group of people not like you.

    • @[email protected]
      2
      edit-2
      7 months ago

      Just because something shouldn’t be doesn’t mean it won’t be. This is reality, and we can’t just wish something to be true. Saying it doesn’t really help anything.

      • @[email protected]
        -1
        edit-2
        7 months ago

        Whoooooosh.

        In societies that have a healthy relationship with the human body, nudity is not considered sexual. I’m not just making up fantasy scenarios.