This may be an unpopular opinion… Let me get this straight: we get big tech corporations to read the articles on the web and then summarize, for me the user, the info I'm looking for. Sounds cool, right? Yeah, except why in the everloving duck would I trust Google, Microsoft, Apple or Meta to give me correct info, unbiased and uncurated? Past experience shows they will not do the right thing. So why is everyone so OK with what's going on? I just heard that Google may intend to remove sources. Great, so it's like "trust me bro".

  • @[email protected]
    6 points · 7 months ago

    I’ve had this argument with friends a lot recently.

    Them: It’s so cool that I can just ask ChatGPT to summarise something and get a concise answer, rather than googling around for the same thing.

    Me: But it gets things wrong all the time.

    Them: Oh, I know, so I Google it anyway.

    Doesn’t make sense to me.

    • @[email protected]
      11 points · 7 months ago

      People like AI because searches are full of SEO spam listicles. Eventually they will make LLMs as ad-riddled as everything else.

      • @[email protected]
        3 points · 7 months ago

        My specific point here was about how this friend doesn’t trust the results AND still goes to Google/others to verify, so he’s effectively doubled his workload for every search.

      • @[email protected]OP
        1 point · 7 months ago

        Then why not use an ad-blocker? It’s not wise to think you’re getting the right information when you can’t verify the sources. Like I said, at least for me, the trust me bro aspect doesn’t cut it.

    • @[email protected]
      3 points · 7 months ago

      We also get things wrong all the time. Would you double-check info you got from a friend or coworker? Perhaps you should.

      • @[email protected]
        1 point · 6 months ago

        I know how my friends and coworkers are likely to think. An LLM is far less predictable.

    • @[email protected]
      2 points · 6 months ago

      This is why I do a lot of my Internet searches with perplexity.ai now. It tells me exactly what it searched to get the answer, and provides inline citations as well as a list of its sources at the end. I’ve never used it for anything in depth, but in my experience, the answer it gives me is typically consistent with the sources it cites.