• Bloonface@kbin.social · 2 years ago

          Yeah, but with search results you can tell from the context — a list of random web pages — that what Google surfaces might be bollocks.

          Google gives you a bunch of results and says “here, look at these”. LLMs confidently tell you things that they may have simply made up and present them as if they’re real.

        • noodlejetski@kbin.social · 2 years ago

          Google actually pulls results from web pages.

          you know how some smartphone keyboards predict the next word that you’re going to use, and you can form a comprehensible sentence that sometimes even makes sense by simply tapping the next word on the prediction bar over and over? that’s what those language models do. they don’t actually search for anything, they just create sequences of words that sound probable.
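          The keyboard analogy can be sketched in a few lines. This is a toy bigram predictor, not how real LLMs work (they use neural networks over subword tokens and vastly more context), but the loop — pick a likely next word, append it, repeat — is the same shape. The corpus and function names here are made up for illustration.

          ```python
          import random
          from collections import Counter, defaultdict

          # Tiny toy corpus; a real model trains on billions of words.
          corpus = (
              "the model predicts the next word the model picks the most "
              "probable word and repeats the loop"
          ).split()

          # Count which word follows which (a bigram table).
          following = defaultdict(Counter)
          for prev, nxt in zip(corpus, corpus[1:]):
              following[prev][nxt] += 1

          def generate(start, length=8):
              """Tap the 'prediction bar' over and over: take a probable next word each time."""
              words = [start]
              for _ in range(length):
                  candidates = following.get(words[-1])
                  if not candidates:
                      break
                  # Sample in proportion to how often each word followed the last one.
                  choices, weights = zip(*candidates.items())
                  words.append(random.choices(choices, weights=weights)[0])
              return " ".join(words)

          print(generate("the"))
          ```

          Nothing in this loop ever searches for or checks a fact — it only continues the sequence with words that sound probable, which is the point being made above.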

    • codus@leby.dev · 2 years ago

      I’d use some sort of generative “find on page” or “summarize page” feature, where I could do a quick Q&A without needing to read a long article.