• @[email protected]
    link
    fedilink
    English
    501 year ago

    If the man did not distribute the pictures, how did the government find out? Did a cloud service rat him out? Or spyware?

    • @[email protected]
      link
      fedilink
      English
      511 year ago

      My guess would be that he wasn’t self-hosting the AI model, so the requests were going through a website.

        • @[email protected]
          link
          fedilink
          English
          51 year ago

          ChatGPT can be tricked into giving IED instructions if you ask the right way. So it could be a similar situation.

        • BreakDecks · 2 · 1 year ago

          Why should it have that? Stable Diffusion websites know that most of their users are interested in NSFW content. I think the idea is to turn GPUs into cash flow, not to make sure that it is all wholesome.

          I suppose they could get some kind of sex-plus-children detector going for all generated images, but you’re going to have to train that model on something, so now it’s a chicken-and-egg problem.

    • @[email protected]
      link
      fedilink
      English
      8
      edit-2
      1 year ago

      He was found extorting little girls with nude pics he generated of them.

      Edit: So I guess he just generated them. In that case, how’d they become public? I guess that’s the problem with not reading the article.

      • @[email protected]
        link
        fedilink
        English
        341 year ago

        Earlier this month, police in Spain launched an investigation after images of underage girls were altered with AI to remove their clothing and sent around town. In one case, a boy had tried to extort one of the girls using a manipulated image of her naked, the girl’s mother told the television channel Canal Extremadura.

        That was another case, in Spain, not the guy in Korea. The person in Korea didn’t distribute the images.

      • @[email protected]
        link
        fedilink
        English
        71 year ago

        Why the fuck isn’t that the headline? Jesus, that’s really awful and changes everything.

        • Lowlee Kun · 18 · 1 year ago

          Because that was another case. Extortion and blackmail (and in that case it would count as production of CP, just as it would if you drew from a real child) are already illegal. In this case we simply don’t have enough information.