• 0 Posts
  • 80 Comments
Joined 2 years ago
Cake day: September 5th, 2023

  • Full disclosure - my background is in operations (think IT), not AI research, so some of this might be wrong.

    What’s marketed as AI is something called a large language model. The distinction matters because “AI” implies intelligence, whereas an LLM is something else. At a high level, LLMs use “tokens” to break natural language apart into elements a machine can work with, and then recombine those tokens to “create” something new. When an LLM generates output it does not know what it is saying - it only knows which token statistically comes after the tokens it has already generated.

    So, to answer your question: an AI can hallucinate because it does not know the answer - it’s using advanced math to know that the period goes at the end of the sentence and not in the middle (there’s a toy sketch below that makes this concrete).
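    To make “the statistically next token” idea concrete, here’s a toy sketch in Python. Everything in it is invented for illustration - a hardcoded bigram table stands in for the billions of learned weights a real LLM has - but the generation loop is the same basic idea: pick the next token from a probability distribution, with no understanding of what the text means.

    ```python
    import random

    # Toy bigram table: each token maps to (next_token, probability) pairs.
    # These numbers are made up for illustration; a real LLM learns
    # statistics like these over subword tokens from huge text corpora.
    NEXT_TOKEN_PROBS = {
        "<start>": [("the", 0.6), ("a", 0.4)],
        "the": [("cat", 0.5), ("dog", 0.5)],
        "a": [("cat", 0.5), ("dog", 0.5)],
        "cat": [("sat", 0.7), ("barked", 0.3)],
        "dog": [("barked", 0.6), ("sat", 0.4)],
        "sat": [(".", 1.0)],
        "barked": [(".", 1.0)],
    }

    def generate(max_tokens=10):
        """Emit one token at a time, each chosen only by what statistically
        follows the previous token - no meaning involved."""
        token, output = "<start>", []
        for _ in range(max_tokens):
            candidates, weights = zip(*NEXT_TOKEN_PROBS[token])
            token = random.choices(candidates, weights=weights)[0]
            output.append(token)
            if token == ".":  # the period lands where the statistics put it
                break
        return " ".join(output)

    print(generate())  # e.g. "the cat sat ."
    ```

    Run it a few times and it will happily produce “the cat barked .” - statistically valid given its table, factually wrong. That’s a hallucination in miniature.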


  • Not dissimilar - my three steps:

    1. Ran away from Vista.
    2. Got a job at Microsoft and figured I should learn how to use a core product again (Windows 10).
    3. Dual-booted for years (you never know when you’ll need to boot into Windows for some random task), until Win 11 and Recall…