Kill me now.

      • @[email protected]
        43 · 8 months ago

        I tried the same AI and asked it to provide a list of 20 things, but it only gave me 5. I asked for the rest, and it also apologized and then provided them. It’s weird that it stumbles at first but is able to see its error and fix it. I wonder if it’s something it ‘learned’ from the data set: people not correctly answering prompts the first time.

        • @[email protected]
          10 · 8 months ago

          Something else I also encounter with GPT-4 a lot is asking “why did you do x or y” out of general curiosity, to learn how it handles the task.

          Almost every time it apologizes and does a full redo, avoiding x or y.

        • @[email protected]
          1 · 8 months ago

          Might be an intentional limitation to avoid issues like the “buffalo” incident with GPT-3 (it would start leaking information it shouldn’t after repeating a word too many times).

    • I Cast Fist
      25 · 8 months ago

      I still want to know what the fucking fuck triggered “possible self harm” in your first question.

    • @[email protected]
      1 · 8 months ago

      Normally it ends the conversation at this point and refuses to answer anything else, disabling the text box. At least it let you try again!