• @[email protected]
    19
    4 months ago

    Does this mean the AI was trained on CP material? How else would it know how to do this?

    • @[email protected]
      37
      4 months ago

      It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.

      AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

      • @deraceituno
        6
        4 months ago

        Training is how it knows it…

        • @[email protected]
          44
          4 months ago

          You can ask it to make an image of a man made of pizza. That doesn’t mean it was trained on images of that.
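
          As an illustration of that kind of concept composition, here is a minimal text-to-image sketch using the Hugging Face diffusers library. The checkpoint name, prompt, and output path are illustrative assumptions, not anything referenced in this thread:

          # Minimal sketch: composing two separately learned concepts in one prompt.
          # Assumes the `torch` and `diffusers` packages and a CUDA GPU; the
          # checkpoint name below is an example choice, not a specific model
          # from this discussion.
          import torch
          from diffusers import StableDiffusionPipeline

          pipe = StableDiffusionPipeline.from_pretrained(
              "runwayml/stable-diffusion-v1-5",
              torch_dtype=torch.float16,
          ).to("cuda")

          # The training set presumably contains no photos of a man made of
          # pizza; the model blends its separate notions of "man" and "pizza"
          # at sampling time.
          image = pipe("a photo of a man made of pizza").images[0]
          image.save("pizza_man.png")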

          • @[email protected]
            4
            4 months ago

            But it means that it was trained on people and on pizza. If it can produce CSAM, it means it had access to pictures of naked minors. Even if it wasn’t in a sexual context.

            • @[email protected]
              8
              4 months ago

              Minors are people. It knows what clothed people of all ages look like. It also knows what naked adults look like. The whole point of AI is that it can fill in the gaps and create something it wasn’t trained on. Naked + child is just a simple equation for it to solve

        • @[email protected]
          10
          4 months ago

          The whole point of those generative models is that they are very good at blending different styles and concepts together to create coherent images. They’re also really good at editing images to add or remove entire objects.

        • @[email protected]
          6
          4 months ago

          I think what @[email protected] meant was that the AI could be trained on what sex is and what children are as separate concepts. Then a user request could put those two concepts together.

          But as the replies I got show, there are multiple ways this could have been accomplished. All I know is AI needs to go to jail.

      • @[email protected]
        3
        4 months ago

        AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.

        Local model go brrrrrr

    • @[email protected]
      2
      4 months ago

      Likely yes, and even commercial models have an issue with CSAM leaking into their datasets. The scummiest of them likely get an offline model, then add their collection of CSAM to it.