• @[email protected]
    8 months ago

    The issue with direct LLM integration with web search is that they serve two different purposes. I don't search for things and want a GPT response. Likewise, I don't go to ChatGPT and want search results.

    It might seem like a weird distinction, but I use them differently, and when you mush them together they become less useful overall.

    Posting an error message into a search engine may or may not get me a root cause or a fix, but pasting it into ChatGPT will very likely get me on the right track very quickly. Searching for a product I know exists is a pain on ChatGPT, but a web search will pull it up pretty quickly.

    If I search for a product, I absolutely DO NOT WANT A GIANT WALL OF GPT BULLSHIT before meaningful search results.

    They are different products and have different use-cases. Stop trying to blend them! /rant

    • @[email protected]
      8 months ago

      It’s just a different way of searching. You can just say “give me links only”

      It’ll take time for mass adoption.

    • @[email protected]
      8 months ago

      Yet I feel like there are uses for a blend:

      BinGPT took a list of restaurants, gave me all their hours, and then formatted it into a nice markdown table for me.

      The only issue is that at least one of the restaurants had the wrong hours (though I believe this was because I included notes with each restaurant and that confused it).

      Still, it was nice not having to do 20 individual searches and do the formatting manually.