• @[email protected]
    link
    fedilink
    English
    99
    6 months ago

    I think it’s important to note that Linux can be a way to avoid AI, but doesn’t have to be. If you flip the headline around it almost implies that people who do want AI would be missing out by using Linux, but that’s not true at all: instead, the reality is that Linux is still better for them, too, because you could install all the same kind of functionality if you wanted, but it would be wholly under your control, not Microsoft’s.

    • @[email protected]
      link
      fedilink
      English
      34
      edit-2
      6 months ago

      Self-hosted AI seems like an intriguing option for those capable of running it. Naturally this will always be more complex than paying someone else to host it for you, but it seems like that’s the only way if you care about privacy.

      https://github.com/mudler/LocalAI
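For anyone curious what talking to a self-hosted instance actually looks like: LocalAI exposes an OpenAI-compatible HTTP API, so a request is just a small JSON payload POSTed to your own machine. A minimal sketch in Python (the port, endpoint path, and model name here are assumptions based on LocalAI's defaults; use whatever model you've actually configured):

```python
import json

# Build an OpenAI-style chat completion request for a local server.
# LocalAI speaks the same API shape, so no vendor SDK is required.
def build_chat_request(prompt, model="local-model", temperature=0.7):
    return {
        "model": model,  # name of the model configured in LocalAI (assumption)
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

payload = json.dumps(build_chat_request("Why is self-hosting good for privacy?"))

# With a LocalAI instance listening locally (its docs default to port 8080),
# you would POST this payload to /v1/chat/completions with curl or urllib.
print(payload)
```

The point being: nothing leaves your machine unless you choose to send it.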

      • @[email protected]
        link
        fedilink
        English
        3
        6 months ago

        Check out Jan AI. It’s open source and extremely easy to install and run. I run it locally on a 2017 laptop without a dedicated GPU and it works, just takes longer to generate responses compared to something like ChatGPT.

    • @[email protected]
      link
      fedilink
      English
      9
      6 months ago

      Beautifully stated. Owning the AI personally, just as I own my personal computer (if not more so), is the key.

    • @[email protected]
      link
      fedilink
      English
      3
      6 months ago

      That sounds very cool. I’m totally ignorant of the hardware requirements. What sort of minimum setup would such an install take?

      • @[email protected]
        link
        fedilink
        English
        5
        6 months ago

        It really depends on what model you want to run and how much training is bundled with it. You can pretty much run any model if you have enough disk space but of course GPU + VRAM is preferred for a ChatGPT like fast response. Otherwise, running on an older CPU and RAM is going to be noticeably slower, especially with complex models with a lot of training data to trawl through.

        There are some pretty lite models out there but the responses will be more barebones and probably seem ‘less informed’.

        Give GPT4All a try for your first time. It makes installation, configuration and usage point-and-click while being fairly straightforward. For the featured models it shows a short summary and the recommended VRAM, though many, many other models are available from inside the UI.
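As a rough back-of-the-envelope check before downloading anything: a model's weight memory is approximately parameter count × bytes per weight, so a 7B model wants roughly 14 GB at 16-bit precision but only ~4 GB at 4-bit quantization. A small sketch of that arithmetic (the 20% overhead factor for context/KV cache is a loose assumption, not a measured figure):

```python
# Rough rule of thumb: weights alone take params * bytes-per-weight.
# Real usage adds context/KV-cache overhead; 20% here is a loose assumption.
def approx_model_gib(params_billions, bits_per_weight, overhead=1.2):
    bytes_total = params_billions * 1e9 * (bits_per_weight / 8)
    return bytes_total * overhead / 2**30

# A 7B model at 4-bit quantization fits in ~4 GiB; at fp16 it needs ~16 GiB.
print(round(approx_model_gib(7, 4), 1))   # ~3.9
print(round(approx_model_gib(7, 16), 1))  # ~15.6
```

That's why quantized models are the usual choice for laptops and older GPUs: the 4-bit version of a model often fits where the full-precision one never could.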