• @[email protected]
      5 points · 5 months ago

      I was using the term pretty loosely there. It’s not psychopathic in the medical sense because it’s not human.

      As I see it, it’s an alien semi-intelligence with no interest in pretty much any human construct, except insofar as it helps it predict the next token. So, no empathy or guilt, but that’s not unusual or surprising.

    • @[email protected]
      3 points · 5 months ago

      That’s a part of it. Another part is that it looks for patterns that it can apply in other places, which is how it ends up hallucinating functions that don’t exist and things like that.

      Like it can see that English has the verbs add, sort, and climb. And it will see a bunch of code with functions like add(x, y) and sort(list), and it might conclude that there must also be a climb(thing) function, because that follows the pattern of functions being verb(object). It doesn’t know what code is, or even what verbs are, for that matter. It can generate text explaining them, because such explanations are definitely part of its training data, but it “understands” them the same way a dictionary understands words or an encyclopedia understands the concepts it contains.
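
      To make that pattern-over-meaning point concrete, here’s a deliberately crude sketch. It is nothing like a transformer’s internals, and the corpus, the function names, and the complete() helper are all invented for illustration: a completer that has only ever seen verb(object)-shaped code will happily emit that same shape for a verb it has never seen defined anywhere.

      ```python
      # Toy illustration only: a "completer" that has learned nothing but the
      # surface shape verb(...) from a tiny corpus, and applies it to any verb.
      import re
      from collections import Counter

      training_code = [
          "total = add(x, y)",
          "result = sort(items)",
          "value = add(a, b)",
          "ordered = sort(names)",
      ]

      # Learn only the surface pattern: some word immediately followed by "(".
      verb_counts = Counter()
      for line in training_code:
          for verb in re.findall(r"(\w+)\(", line):
              verb_counts[verb] += 1

      def complete(prompt_verb: str) -> str:
          # No notion of which functions actually exist; it just reproduces
          # the shape it has always seen.
          return f"result = {prompt_verb}(thing)"

      print(complete("add"))    # plausible: a real function in the corpus
      print(complete("climb"))  # confidently emits climb(thing), which exists nowhere
      ```

      The point of the sketch is just that the “hallucination” isn’t a malfunction of understanding; it’s the pattern doing exactly what it was trained to do, in a place where the pattern happens not to hold.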