• @[email protected]
      48 months ago

      Except that the information it gives you is often objectively incorrect, and it makes up sources (this has happened to me many times). And no, it can’t do what a human can: it doesn’t interpret the information it receives, and it can’t reach new conclusions based on what it “knows”.

      I honestly don’t know how you can even begin to compare an LLM to the human brain.