• @[email protected]
    link
    fedilink
    English
    61
    edit-2
    8 months ago

    Deepfakes of an actual child should be considered defamatory use of a person’s image; but they aren’t evidence of actual abuse the way real CSAM is.

    Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

    Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

    But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it was drawn with a pencil.

    • Uranium3006 · 10 points · 8 months ago

      Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

      Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

      • @[email protected]
        link
        fedilink
        English
        11
        edit-2
        8 months ago

        As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of “protecting children”, yes.

        • Uranium3006 · 6 points · 8 months ago

          And it’s usually fascists, or at least people who may not consider themselves as such but think and act like fascists anyways.

    • @[email protected]
      link
      fedilink
      English
      38 months ago

      Add in an extra twist. Hopefully, if the sickos are at least happy with AI stuff, they won’t need “real”.

      Sadly, a lot of it does evolve from wanting to “watch” to wanting to do

      • @[email protected]
        link
        fedilink
        English
        98 months ago

        Sadly, a lot of it does evolve from wanting to “watch” to wanting to do

        This is the part where I disagree, and I would love for people to prove me wrong. Whether this is true or false, it will probably be the deciding factor in allowing or restricting “artificial CSAM”.

      • @[email protected]
        link
        fedilink
        English
        48 months ago

        Sadly, a lot of it does evolve from wanting to “watch” to wanting to do

        Have you got a source for this?

        • JohnEdwa · 3 points · 8 months ago

          Some people are sadists and rapists, yes, regardless of what age group they’d want to do it with.