• @[email protected]
    27 months ago

    I think there may be a market for an LLM that runs locally and privately incorporates personal data.

    • @[email protected]
      27 months ago

      Yes, there is. And yes, it would be huge. I know a lot of people who are staying away from all this until the privacy issues are resolved (there are other issues, but at this point, the cat is out of the bag).

      But running large models locally requires a ton of resources. It may become a reality in the future, but in the meantime, allowing more, smaller providers to offer a service (plus a self-hosted option for corporations/enthusiasts) is far better in terms of resource usage. And it’s already a thing; what needs work now is improving UI and integrations.

      In fact, very far from the “impressive” world of generated text and pictures, using an LLM with integrations (or whatever the right term is) to build a sort of documentation index that you can query in natural language is a very interesting tool that could be useful for a lot of people, both individuals and corporate environments. And some projects are already looking that way.
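      A minimal sketch of that idea, assuming plain TF-IDF retrieval over local text files as a stand-in for an embedding model, and leaving the "ask the LLM" step as a placeholder; the folder path, question, and function names are made up for illustration:

      ```python
      # Minimal local "ask your docs" sketch: retrieve the most relevant
      # passages with TF-IDF; a real setup would hand them to a local LLM.
      # Assumes scikit-learn is installed; paths and names are illustrative.
      from pathlib import Path

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity


      def build_index(doc_dir: str):
          """Read every .md file under doc_dir and build a TF-IDF index."""
          paths = sorted(Path(doc_dir).rglob("*.md"))
          texts = [p.read_text(encoding="utf-8") for p in paths]
          vectorizer = TfidfVectorizer(stop_words="english")
          matrix = vectorizer.fit_transform(texts)
          return paths, vectorizer, matrix


      def query(question: str, paths, vectorizer, matrix, top_k: int = 3):
          """Return the top_k documents most similar to the question."""
          q_vec = vectorizer.transform([question])
          scores = cosine_similarity(q_vec, matrix)[0]
          ranked = scores.argsort()[::-1][:top_k]
          return [(paths[i], scores[i]) for i in ranked]


      if __name__ == "__main__":
          paths, vec, mat = build_index("./docs")  # hypothetical local docs folder
          for path, score in query("how do I rotate my API keys?", paths, vec, mat):
              print(f"{score:.3f}  {path}")
          # In a full setup the retrieved passages would be passed as context
          # to a locally running LLM so it can answer in natural language.
      ```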

      I’m not holding my breath for good, portable, customized large models (if only because of the economics of energy consumption), but moving away from “everything goes to a third-party service provider” is a great goal.