• @[email protected]
    53 points · 6 months ago

    Why would they not? There’s no way for such a system to know it’s AI generated unless there’s some metadata that makes it obvious. And even if it were, who’s to say the user wouldn’t want to see those images in the results?

    This is a nothing issue. It’s not like the image is being generated in response to a search; it’s something that already existed being returned as a result because there is presumably something that links it to the search.

    • Ricky Rigatoni
      21 points · 6 months ago

      To put it bluntly: this is kind of like complaining that a pencil drawing on a napkin showed up in the results.

    • @[email protected]
      6 points · 6 months ago

      There’s no way for such a system to know it’s AI generated unless there’s some metadata that makes it obvious.

      I agree with your comment but just want to point out that AI-generated images actually often do contain metadata, usually describing the model and prompt used.
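
      For anyone curious, here’s roughly how you could check for it yourself. A minimal sketch, assuming Pillow and a PNG written by a tool like the Stable Diffusion web UI (which records its settings in a tEXt chunk, commonly under a "parameters" key); the key names vary by generator, so treat this as a heuristic:

      ```python
      # Minimal sketch: look for AI-generation metadata in an image's text chunks.
      # Assumes Pillow; key names ("parameters", "prompt", ...) vary by tool.
      from PIL import Image

      def find_generation_metadata(path: str) -> dict:
          """Return textual metadata entries that look like generation info."""
          img = Image.open(path)
          # PNG tEXt/iTXt chunks are exposed via .text; .info carries other
          # format-level metadata (it may contain non-string values too).
          chunks = dict(getattr(img, "text", {}) or {})
          chunks.update({k: v for k, v in img.info.items() if isinstance(v, str)})
          keys_of_interest = {"parameters", "prompt", "workflow", "comment"}
          return {k: v for k, v in chunks.items() if k.lower() in keys_of_interest}

      if __name__ == "__main__":
          # "example.png" is a hypothetical local file, just for illustration.
          print(find_generation_metadata("example.png") or "no generation metadata found")
      ```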

      • @[email protected]
        14 points · 6 months ago

        By the time a user has shared them, 99% of the time all superfluous metadata has been stripped, for better or worse.
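
        For example, a bare re-encode is enough to lose it. A minimal sketch, assuming Pillow and a hypothetical generated.png that carries text chunks; saving without explicitly passing them along (roughly what most platforms’ processing pipelines do) drops them:

        ```python
        # Minimal sketch: re-saving a PNG without forwarding its text chunks
        # (no pnginfo argument) silently drops the generation metadata.
        from PIL import Image

        src = Image.open("generated.png")   # hypothetical input file
        print("before:", dict(getattr(src, "text", {}) or {}))

        src.save("reshared.png")            # text chunks are not copied over
        print("after:", dict(getattr(Image.open("reshared.png"), "text", {}) or {}))
        ```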

    • TheHarpyEagle
      0 points · 6 months ago

      That’s fine for looking up cat pictures or porn, but many people are searching for information contained in images, and that is a problem. What if you were looking for a graph, a map, a blueprint, etc.? How do you discern the real from the fake? And what if you click through and the image seems to come from a legit source, but that source is itself AI-generated?