I followed these steps, but just so happened to check on my mason jar 3-4 days in and saw tiny carbonation bubbles rapidly rising throughout.

I thought that may just be part of the process but double checked with a Google search on day 7 (when there were no bubbles in the container at all).

Turns out I had just grown a botulism culture; garlic in olive oil specifically is a fairly common way to produce this biotoxin.

Had I not checked on it 3-4 days in I’d have been none the wiser and would have Darwinned my entire family.

Prompt with care and never trust AI dear people…

  • @[email protected]
    21 points · 5 months ago

    I am saying that coining it as a term was stupid and intended to make it sound intelligent when it isn’t.

    • David GerardOPM
      11 points · 5 months ago

oh definitely, it’s fucking terrible question-begging. I’d like to know how far back it traces, and how good-faith it was or wasn’t

      • @[email protected]
        4 points · 5 months ago

        It originally comes from false positives in computer vision afaik, where it makes some sense as the model is “seeing” things that aren’t in the image.

    • 𝘋𝘪𝘳𝘬
      3 points · 5 months ago

Of course the term is stupid. An LLM isn’t an AI, nor is any AI in its current state intelligent. In the end they all boil down to being answer machines. Complex ones, but still far from anything even remotely resembling an AI.