Kill me now.

      • Riven@lemmy.dbzer0.com · 43 points · 1 year ago

        I tried the same AI and asked it to provide a list of 20 things; it only gave me 5. I asked for the rest, and it apologized and then provided them. It’s weird that it stumbles at first but is able to see its error and fix it. I wonder if it’s something it ‘learned’ from the dataset: people not correctly answering prompts the first time.

        • webghost0101@sopuli.xyz · 10 points · 1 year ago

          Something else I encounter a lot with GPT-4 is asking “why did you do x or y?” out of general curiosity, to learn how it handles the task.

          Almost every time, it apologizes and does a full redo, avoiding x or y.

        • Gabu@lemmy.world · 1 point · 1 year ago

          Might be an intentional limitation to avoid issues like the “buffalo” incident with GPT-3 (it would start leaking information it shouldn’t after repeating a word too many times).

    • nucleative@lemmy.world · 1 point · 1 year ago

      Normally it ends the conversation at this point and refuses to answer anything else, disabling the text box. At least it let you try again!