“There’s no way to get there without a breakthrough,” OpenAI CEO Sam Altman said, arguing that AI will soon need even more energy.

  • @[email protected]
    10 months ago

Lol, “old M1 laptop”? 3 to 4 years is not old, damn!

(I’m running a MacBookPro5,3 (mid 2009) on Arch, lol)

But nice to hear that the M1 (and thus theoretically even the iPad, if you are not talking about the M1 Pro / M1 Max) can already run Llama 2 7B.

Have you tried Mistral AI’s model yet? It should be a bit more powerful and a bit more efficient, IIRC. And it is Apache 2.0 licensed.

    https://mistral.ai/news/announcing-mistral-7b/

    • TheRealKuni
      10 months ago

      But nice to hear that the M1 (and thus theoretically even the iPad, if you are not talking about the M1 Pro / M1 Max) can already run Llama 2 7B.

      An iPhone XR/XS can run Stable Diffusion, believe it or not.

    • @[email protected]
      10 months ago

      3 to 4 years is not old

      Huh, nice. I got the MacBook Air secondhand, so I thought it was older. Thanks for the suggestion, I’ll try Mistral AI next, perhaps on my phone as a test.