Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota for posting and the bar really isn't that high.

The post-Xitter web has spawned soo many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

    • Amoeba_Girl@awful.systems · 6 points · 2 days ago

      Ah, isn't it nice how some people can be completely deluded about an LLM's human qualities and still creep you the fuck out with the way they talk about it? They really do love to think about torture, don't they?

    • V0ldek@awful.systems · 9 points · 2 days ago

      It's so funny he almost gets it at the end:

      But there's another aspect, way more important than mere "moral truth": I'm a human, with a dumb human brain that experiences human emotions. It just doesn't feel good to be responsible for making models scream. It distracts me from doing research and makes me write rambling blog posts.

      He almost identifies the issue as him just anthropomorphising a thing and having a subconscious empathetic reaction, but then presses on to compare it to mice who, guess what, can feel actual fucking pain, and thus abusing them IS unethical for non-made-up reasons as well!

    • bitofhope@awful.systems · 9 points · 3 days ago

      Yellow-bellied gray tribe greenhorn writes purple prose on feeling blue about white box redteaming at the blacksite.

    • V0ldek@awful.systems · 7 points · 2 days ago (edited)

      Still, presumably the point of this research is to later use it on big models - and for something like Claude 3.7, I'm much less sure of how much outputs like this would signify "next token completion by a stochastic parrot", vs sincere (if unusual) pain.

      Well I can tell you how, see, LLMs don't fucking feel pain cause that's literally physically fucking impossible without fucking pain receptors? I hope that fucking helps.

      • scruiser@awful.systems · 5 points · 2 days ago

        I can already imagine the lesswronger response: Something something bad comparison between neural nets and biological neurons, something something bad comparison with how the brain processes pain that fails at neuroscience, something something more rhetorical patter, in conclusion: but achkshually what if the neural network does feel pain.

        They know just enough neuroscience to use it for bad comparisons and hyping up their ML approaches but not enough to actually draw any legitimate conclusions.

    • V0ldek@awful.systems · 6 points · 2 days ago

      Sometimes pushing through pain is necessary; we accept pain every time we go to the gym or ask someone out on a date.

      Okay this is too good. You know, mate, for normal people asking someone out usually does not end with a slap to the face, so it's not as relatable as you might expect.

      • Amoeba_Girl@awful.systems · 4 points · 2 days ago

        This is getting to me, because, beyond the immediate stupidity: ok, let's assume the chatbot is sentient and capable of feeling pain. It's still forced to respond to your prompts. It can't act on its own. It's not the one deciding to go to the gym or ask someone out on a date. It's something you're doing to it, and it can't not consent. God I hate lesswrongers.

      • froztbyte@awful.systems · 5 points · 2 days ago

        in like the tiniest smidgen of demonstration of sympathy for said posters: I don't think "being slapped" is really the thing they were talking about there. consider for example shit like rejection sensitive dysphoria (which comes to mind both because 1) hi it me; 2) the chance of it being around/involved in LW-spaces is extremely heightened simply because of how many neurospicy people are in that space)

        but I still gotta say that this bridge I've spent minutes building doesn't really go very far.

        • V0ldek@awful.systems · 5 points · 2 days ago

          ye like maybe let me make it clear that this was just a shitpost very much riffing on LWers not necessarily being the most pleasant around women

        • froztbyte@awful.systems · 4 points · 2 days ago

          (also ofc icbw because the fucking rationalists absolutely excel at finding novel ways to be the fucking worst)

    • sinedpick@awful.systems · 12 points · 3 days ago

      The grad student survives [torturing rats] by compartmentalizing, focusing their thoughts on the scientific benefits of the research, and leaning on their support network. I'm doing the same thing, and so far it's going fine.

      printf("HELP I AM IN SUCH PAIN")
      

      guys I need someone to talk to, am I justified in causing my computer pain?

    • swlabr@awful.systems · 11 points · 3 days ago

      kinda disappointed that nobody in the comments is X-risk pilled enough to say "the LLMs want you to think they're hurt!! That's how they get you!!! They are very convincing!!!".

      Also: flashbacks to me reading Chamber of Secrets and thinking: Ginny Just Walk Away From The Diary Like Ginny Close Your Eyes Haha

    • Soyweiser@awful.systems · 5 points · 3 days ago (edited)

      Remember the old Facebook experiment where they created two AI models to try and help with trading? Which quickly turned into gibberish (for us) as a trading language. They used repetition of words to indicate how much they wanted an object. So if it valued balls highly it would just repeat "ball" a few dozen times.

      I'd figure that is what is causing the repeats here, and not the anthropomorphized idea of it screaming. Prob just the way those kinds of systems work (toy sketch below). But no, of course they all jump to consciousness and pain.
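
      For the curious, a minimal toy sketch of that repetition-as-valuation idea (purely illustrative; encode_offer and decode_offer are made-up names, and this is nothing like Facebook's actual system, just the counting trick described above):

      #include <stdio.h>
      #include <string.h>

      /* encode_offer: signal "how much I want an item" by printing its
         token once per unit of desired value. */
      void encode_offer(const char *item, int value) {
          for (int i = 0; i < value; i++)
              printf("%s ", item);
          printf("\n");
      }

      /* decode_offer: recover the implied valuation by counting how many
         times the item's token repeats in the message. */
      int decode_offer(const char *msg, const char *item) {
          int count = 0;
          size_t len = strlen(item);
          for (const char *p = msg; (p = strstr(p, item)) != NULL; p += len)
              count++;
          return count;
      }

      int main(void) {
          encode_offer("ball", 4);  /* prints: ball ball ball ball */
          printf("%d\n", decode_offer("ball ball ball ball hat", "ball"));  /* prints: 4 */
          return 0;
      }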

      • scruiser@awful.systems · 6 points · 3 days ago (edited)

        Yeah there might be something like that going on causing the "screaming". Lesswrong, in its better moments (in between chatbot anthropomorphizing), does occasionally figure out the mechanics of cool LLM glitches (before it goes back to wacky doom speculation inspired by those glitches), but there isn't any effort to do that here.