• @[email protected]
    67 · 3 months ago

    And trust me, these generated images are getting scarily good.

    I have to agree, I would not be able to spot a single one of them as fake. They look really convincingly authentic IMO.

    • Flying SquidOP
      146 · 3 months ago

      Stalin famously ordered people he had killed erased from photos.

      Imagine what current and future autocratic regimes will be able to achieve when they want to rewrite their histories.

        • @[email protected]
          26 · 3 months ago

          Probably just because some people really like Stalin, and have become convinced his accounts are the truthful ones and everyone else lies about him.

          • @[email protected]
            5 · 3 months ago

            That’s a scary thought!! But all kinds of crazy exist, and I mean people would have to be literally crazy to want to live under a regime like the one Stalin built.

        • @[email protected]
          14 · 3 months ago

          “Photoshopping” something bad has existed for a long time at this point. AI-generated images don’t really change anything, other than the entire photo being fake instead of just a small section.

          • @[email protected]
            23 · 3 months ago

            I’d disagree. It takes, now, zero know-how to convincingly create a false image. And it takes zero work. So where one photo would take one person a decent amount of time to convincingly pull off, now one person can create 100 images or more in that time, each one a potential time bomb that will go off when it starts getting passed around as evidence of something. And there are uncountable numbers of bad actors on the internet trying to cause a ruckus. This just increased their chances of succeeding at least 100-fold, and opened the access to many, many others who might just do it accidentally, for a joke, or who always wanted to create waves but didn’t have the photoshop skills necessary.

          • Flying SquidOP
            13 · 3 months ago

            It changes a lot. Good Photoshopping skills would not create the images as shown in the article.

            • Aniki 🌱🌿
              4 · 3 months ago

              Yeah some of these would be like 100 layer creations if someone was doing it themselves in photoshop – It would take a professional or near-professional level of skills.

          • @[email protected]
            5 · 3 months ago

            The ease and speed with which AI-created photos can be made, at a quality most photoshoppers could only dream of, does very much change everything.

        • StarkWolf
          9 · 3 months ago

          With AI video also getting increasingly impressive and believable, I worry that we will soon live in a world where you could have actual video evidence of a murder, and that evidence would be dismissed or cast into doubt because of how easy, or supposedly how easy, it would be to fake.

          • @[email protected]
            3 · 3 months ago

            Absolutely, only video from trusted sources can be used. But isn’t that already the case?

          • FaceDeer
            1 · 3 months ago

            Better than having people get convicted based on fake evidence, though.

            • StarkWolf
              2 · 3 months ago

              I think they are both equally scary. I’m imagining cases where photo and video evidence have played major roles in proving police abuses of power, for example. We will certainly have an onslaught of people faking evidence of all sorts of things to push a political narrative, but equally, in any politicized narrative, any politically inconvenient photos or videos of real things that really happened might be swept under the rug as “someone probably just faked that for political gain.” Sure, you could have an investigation to look into the authenticity of the evidence, or look at other forensic evidence, but probably only if you can afford to have such an investigation done, or if enough public attention gets drawn to it. I fear we are reaching a scary time where, in a sense, reality will be whatever people want it to be, and we will increasingly be unable to trust anything we see as real with absolute certainty. We have been headed down this road for a very long time, but this will just make it much worse.

      • magic_lobster_party
        11 · 3 months ago (edited)

        Digital image editing has been really good for this kind of stuff for quite a while. Now it’s even easier with content aware fill.

        Unless you’re the PR manager for the British Royal family. Then you somehow lack the basic skills to make convincing edits.

      • Cosmic Cleric
        3 · 3 months ago (edited)

        Honestly, it looks like the picture on the left is fake, like the guy was inserted into it. Just look at his outline, compared with the rest of the background.

        (I’m no Stalin fan, just commenting on the picture itself.)

      • @[email protected]
        3 · 3 months ago

        I can imagine such regimes nowadays developing some sort of cryptographic photo attestation, so any photo not signed by them would be shown as untrusted, regardless of whether it’s fake or not. And all the code, from the processor to the camera app, would need to be approved by their servers in order to get a signature.

        Oh wait! Our great friends at Adobe, Intel, Google and Microsoft are already working on just that: https://c2pa.org/
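To make the attestation idea above concrete, here is a minimal, stdlib-only sketch of the signing-and-verification flow the comment describes. Everything in it is made up for illustration: real schemes like C2PA use asymmetric key pairs and signed metadata manifests, whereas this toy uses a shared-secret HMAC purely to stay self-contained. The point it shows is the trust model, not the actual protocol: only the authority can produce valid signatures, and any edit to the image bytes invalidates the signature.

```python
import hashlib
import hmac

# Hypothetical toy model of centralized photo attestation. A real system
# (e.g. C2PA) signs a metadata manifest with an asymmetric key; HMAC with a
# shared secret is a simplification used here only to keep this runnable
# with the standard library.

AUTHORITY_KEY = b"authority-secret"  # made-up key, for illustration only

def sign_photo(image_bytes: bytes) -> bytes:
    """Authority side: produce a signature over the raw image bytes."""
    return hmac.new(AUTHORITY_KEY, image_bytes, hashlib.sha256).digest()

def is_trusted(image_bytes: bytes, signature: bytes) -> bool:
    """Viewer side: anything unsigned, or edited after signing, is untrusted."""
    expected = hmac.new(AUTHORITY_KEY, image_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

photo = b"\x89PNG...original pixels"
sig = sign_photo(photo)
print(is_trusted(photo, sig))              # True: approved by the authority
print(is_trusted(photo + b"edit", sig))    # False: any change breaks the signature
```

Note the dual-use nature the thread worries about: the same mechanism that lets a viewer reject tampered images also lets whoever holds the key decide which images count as "real" in the first place.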