• @[email protected]

Not quite; it is an intelligent summary. More advanced models would realize that's bad advice and not give it. However, for search results Google uses a lightweight, dumber model (Flash), which does not realize this.

I tested with the rock example, albeit on a different search engine (Kagi). The base model gave the same answer as Google (ironically, based on articles about Google's bad results; it seems it was too dumb to realize that the quotations in those articles were examples of bad results, not actual facts), but the more advanced model understood, explained how the bad advice had been spreading around, and said you should not follow it.

It isn't a hallucination though; you're right about that.