Am I missing something? The article seems to suggest it works via hidden text characters. Has OpenAI never heard of pasting text into a utf8 notepad before?
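
If the watermark really does work via hidden characters, the commenter's point is easy to sketch. This is a hypothetical illustration (the actual character set OpenAI might use is an assumption): zero-width Unicode code points survive copy-paste between rich editors but vanish the moment anyone filters to visible text.

```python
# Hypothetical sketch: a "watermark" embedded as zero-width Unicode
# characters, and the trivial filter that strips it. The specific
# code points chosen here are assumptions, not OpenAI's actual scheme.
ZERO_WIDTH = {"\u200b", "\u200c", "\u200d", "\ufeff"}  # ZWSP, ZWNJ, ZWJ, BOM

def strip_zero_width(text: str) -> str:
    # Keep only characters that are not in the zero-width set.
    return "".join(ch for ch in text if ch not in ZERO_WIDTH)

marked = "plain\u200b text\u200c with\u200d hidden\ufeff marks"
print(strip_zero_width(marked))  # the hidden marks are gone
```

Any plain-text round trip (a bare notepad, `tr`, a regex) does the same thing, which is why character-level watermarks are considered fragile.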

    • @[email protected]
      8 points · 1 month ago

      A few years ago the output of GPT was complete gibberish, and a few years before that, even producing such gibberish would’ve been impressive.

      It doesn’t take anyone’s job until it does.

      • @[email protected]
        7 points · 1 month ago

        A few years ago the output of GPT was complete gibberish

        That’s not really true. Older GPTs were already really good. Did you ever see SubredditSimulator? I’m pretty sure that first came around like 10 years ago.

        • archomrade [he/him]
          3 points · 1 month ago

          They were good for about a paragraph, maybe less.

          As soon as they reached the attention limit they started talking gibberish.

        • @[email protected]
          2 points · 1 month ago

          The first time I saw text written by GPT, it all seemed alright at first glance, but once you actually started reading it was immediately obvious it had no idea what it was talking about. It was grammatically correct nonsense.

    • Angry_Autist (he/him)
      5 points · 1 month ago

      LLMs aren’t going to take coding jobs; there are specialized AIs being trained specifically for that. They write code that works but doesn’t make sense to human eyes. It’s fucking terrifying, but EVERYONE just keeps focusing on the LLMs.

      There are at least 2 more dangerous model types being used right now to influence elections and manipulate online spaces and ALL everyone cares about is their fucking parrot bots…