• Nora

    I had an idea when these AI image generators first started gaining traction: flood the CSAM market with AI-generated images (good enough that you can’t tell them apart). In theory this would put the actual creators of CSAM out of business, thus saving a lot of children from the trauma.

    Most people downvoted the idea as a gut reaction, though.

    Looks like they might do it on their own.

    • @[email protected]

      My concern is: why would it put them out of business? If we just look at legal porn, there is already a huge amount of existing content, and yet the market for new content persists. AI porn hasn’t noticeably decreased the amount being produced.

      Really, flooding the market with CSAM makes it easier to consume and may end up INCREASING the number of people trying to get CSAM. That could end up encouraging more of it to be produced.

      • Nora

        The market is slightly different, though. Most CSAM is images; with porn, there’s a lot of video as well as images.

    • @[email protected]

      It’s also a victimless crime, just like flooding the market with fake rhino horns and dropping the price to the point where poaching isn’t worth it.

    • PirateJesus

      It would be illegal in the United States: artistic depictions of CSAM are prohibited under the PROTECT Act of 2003.

      • @[email protected]

        And yet it’s out there in droves on mainstream sites, completely without issue. Drawings and animations go largely unpoliced.