• @[email protected]

    I love how they chose the term “hallucinate” instead of saying it fails or screws up.

      • @[email protected]

A hallucination is a false sensory perception (sights, sounds, etc.).

LLMs don’t have any senses; they have input, algorithms, and output. They also have desired output and undesired output.

So, no, ‘hallucination’ fits far worse than failure, error, or bad output. However, assigning the term ‘hallucination’ does serve the billionaires in marketing their LLMs as actually sentient.