Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.

  • @[email protected]
    link
    fedilink
    English
    1
    11 months ago

    It’s not illegal to work on, sell, or distribute the models. And making that illegal is what the first commenter said would be dangerous to do, since then regular people wouldn’t be able to compete with corporations’ abilities.

    Once the models and portable hardware are good enough, and it’s just a matter of time, I think you’re underestimating how ubiquitous it will become.

    Every teenage boy will have a pair of nudie glasses in the form of their smartphone running open source models, and you think they’re just going to not use them?

    • @[email protected]
      link
      fedilink
      English
      2
      edit-2
      11 months ago

      I think you again vastly overestimate how many people are going to run their own AI versus using a sanitized, policy-driven, managed platform version that’s cloud based (e.g. Dall-E and ChatGPT right now).

      It’s possible today (and often better) to do a lot of things locally, yet almost everything still routes through an app to a platform on your smartphone, and the few remaining things that don’t still reach a platform through your phone’s browser.

      • @[email protected]
        link
        fedilink
        English
        1
        11 months ago

        When it becomes one click to see the girl across from you naked, tell me how many 16-year-old boys won’t. You are far too naive to be having this conversation.

        • @[email protected]
          link
          fedilink
          English
          1
          edit-2
          11 months ago

          It’s not naive to think that corporations will continue to win the “AI” war. It’s actually pretty naive to think otherwise.

          I also dunno why you think the open-source AI community will focus its efforts on making it easy to generate convincing, likely already illegal deepfake porn of random teenagers in “one click”.

          I’ve been using oss for decades and almost nothing is that easy to do even when it could be. Why would people focus their efforts on this?

          Also also, I don’t get why you think that generating AI porn of people around you is:

          A) so much better than just watching the millions of hours of already available porn

          B) anything even remotely similar to “seeing someone naked”