Orbit is an LLM addon/extension for Firefox that runs on the Mistral 7B model. It can summarize a given webpage, YouTube videos and so on, and you can ask it questions about what’s on the page. It is very privacy friendly and doesn’t require you to sign up for an account.

I personally tried it and found it to be incredibly useful! I think this is going to be one of my long-term addons, along with uBlock Origin, Decentraleyes and so on. I would highly recommend checking it out!

  • DarkThoughts
    6 points • 2 months ago

    Inb4 it not only skips important details in its summarization, but also hallucinates its own interpretation of things into it.

    Generally, don’t call it “AI”, don’t overhype it, don’t use it for things it’s bad at (like telling you “facts”), and don’t shove it into everything. I bet 80+ percent of all “AI” energy consumption is wasted on completely useless and moronic tasks that have zero value even on a personal level.

    • FaceDeer
      10 points • 2 months ago

      "The term “AI” has been in use since 1956 for a wide range of computer science techniques. LLMs most certainly qualify as AI. You may be thinking of the science-fiction kind of “artificial people” AI, which is a subset of AI called Artificial General Intelligence when researchers want to be specific about that kind.

      • DarkThoughts
        2 points • 2 months ago

        I’m thinking of something that actually processes some form of “thought”, in the abstract sense. Even video game AI does that to an extent (granted, there are various techniques depending on the game type), so the term is actually somewhat appropriate there. LLMs don’t do that at all, though; they’re just guessing words based on the texts they were trained on (if we stick with text generation, at least), and that just so happens to produce somewhat coherent sentences that can fool someone into thinking their computer actually talked to them. There never was any sort of thought behind it, though. It functions closer to how your mobile keyboard predicts the next word you want to use in its suggestions at the top: it just tries to complete the text it was already presented with. A lot of the illusion actually comes from the tools used to display this information in a chat-like manner, but that’s just frontend foolery for the user.

        • FaceDeer
          9 points • 2 months ago

          I think it’s more that you’re overestimating video game AI here. If your definition of “abstract thought” doesn’t include what LLMs do, then it definitely shouldn’t include video game AI. It’s even more illusory.

          • DarkThoughts
            1 point • 2 months ago

            Yeah, but you think a lot of weird things, so that does not surprise me in the slightest.

            • @[email protected]
              5 points • 2 months ago

              I would agree with the other guy. A video game AI can be as simple as some if-then decision logic, and I would count that as “AI”. An LLM also makes “decisions” about what to do/say, just via a different mechanism (predictive modeling). I would still bucket that as AI. If you count one, you should count the other (see the toy sketch at the end of this thread). Neither is truly “thinking” in the sense of an AGI.

              • DarkThoughts
                1 point • 2 months ago

                I wasn’t talking about something like the AI in Pong. But if your definition of “AI” is conditionals such as if statements, then absolutely everything is AI, which honestly just further muddles the meaning of the term.
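
To make the mechanism argument in the thread above concrete, here is a minimal, hypothetical Python sketch contrasting the two kinds of “deciding” being compared: hand-written if-then rules (the video game sense) and next-word guessing from previous text (the phone-keyboard comparison). Everything in it is made up for illustration; real LLMs use neural networks over tokens, not word-frequency tables, and real game AI is far more elaborate than three rules.

```python
from collections import Counter, defaultdict

# 1) Game-style "AI": explicit if-then rules decide the next action.
#    (Hypothetical guard behavior, purely for illustration.)
def guard_ai(distance_to_player: float, health: float) -> str:
    if health < 0.2:
        return "flee"
    if distance_to_player < 5.0:
        return "attack"
    return "patrol"

# 2) Keyboard-style next-word guessing: pick the word that most often
#    followed the previous word in a tiny, made-up training text.
training_text = "the cat sat on the mat the cat chased the mouse"
follows = defaultdict(Counter)
words = training_text.split()
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    """Return the most frequent follower seen in training, if any."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

# Repeated next-word guessing "completes" a prompt, one word at a time.
completion = ["the"]
for _ in range(4):
    nxt = predict_next(completion[-1])
    if nxt is None:
        break
    completion.append(nxt)

print(guard_ai(distance_to_player=3.0, health=0.9))  # -> attack
print(" ".join(completion))                          # -> the cat sat on the
```

Both snippets map an input to an output, which is the loose sense in which each gets called “AI”; neither involves anything resembling abstract thought.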