• @[email protected]
    73
    edit-2
    1 day ago

    for example:

    Get off of Facebook (easy). Don’t buy a Tesla or use Starlink (easy). Don’t buy on Amazon (difficult but doable). Don’t upgrade your iPhone, and don’t buy new Apple products (moderate). Don’t use ChatGPT (easy).

    • Diplomjodler
      66
      1 day ago

      Use Linux and open source software. Contribute to open source projects. Buy hardware second hand. Use non corporate social media. Buy local. Get your stuff fixed instead of throwing it away. Avoid data harvesting where possible.

      • haui
        22
        1 day ago

        It’s as if the Linux hardliners were right all along. Almost as if the people who laughed at our cautionary tales are the ones left holding the bag now.

        I’m not saying “we told you so,” but…

    • @[email protected]
      9
      1 day ago

      I’d add not using Amazon Prime, Amazon Web Services, and other Amazon services; not using X; and being critical of SpaceX. Also, stop advertising these things, stop telling your friends about them, maybe even stop talking about them altogether. For some strange reason, bad press is sometimes better than no press.

      • @[email protected]
        5
        23 hours ago

        My company: “We’re going all in on the cloud! Specifically AWS.” Then proceeds to pay them millions per year.

    • tiredofsametab
      9
      1 day ago

      “or use starlink”: I don’t know if I’d call that easy for some people in very remote areas. Easy for me, easy for you, but not necessarily easy in some cases. Here’s hoping a good competitor can get to those places.

    • @[email protected]
      2
      23 hours ago

      I use ChatGPT a lot; what is the best non-billionaire-funded LLM? I really need to change to one that doesn’t worsen the world…

      • @[email protected]
        2
        8 hours ago

        That may be hard, seeing as all AIs use ungodly amounts of electricity. So I’d say they all worsen the world.

      • @[email protected]
        3
        22 hours ago

        While DeepSeek is billionaire funded, it should still be better if run locally. I don’t think FOSS LLMs are at that level yet.

      • @[email protected]
        2
        22 hours ago

        Try locally hosting DeepSeek R1; for me the results are similar to ChatGPT without needing to send any info over the internet.

        LM Studio is a good start.
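        For what it’s worth, LM Studio can also expose the loaded model through a local OpenAI-compatible server (default port 1234). A minimal sketch of querying it from Python, using only the standard library; the model name here is just an illustrative placeholder for a DeepSeek R1 distill, use whatever name LM Studio shows for your download:

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "deepseek-r1-distill-qwen-7b") -> dict:
    # Standard OpenAI-style chat payload; the model name must match what
    # LM Studio actually has loaded (the one here is an assumption).
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def ask_local(prompt: str, url: str = "http://localhost:1234/v1/chat/completions") -> str:
    # Everything stays on localhost: no account, no tokens, no data leaving the machine.
    req = urllib.request.Request(
        url,
        data=json.dumps(build_chat_request(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

        With the server running you’d call `ask_local("why is the sky blue?")` and get the reply text back, same shape as the hosted APIs.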

          • @[email protected]
            1
            5 hours ago

            Any relatively new gaming PC from the last four years or so has enough power to run local LLMs. Maybe not the ginormous 70GB behemoth models, but the toned-down ones are pretty damn good, and if you don’t mind waiting a few seconds while it thinks, you can run one completely locally, as much as you want and whenever you want.

          • @[email protected]
            1
            edit-2
            12 hours ago

            You would benefit from some GPU offloading, which would considerably accelerate the answers. But at the bare minimum, you only need enough RAM to load the model.
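            A rough sanity check of what “enough RAM to load the model” means, as a back-of-the-envelope sketch; the 4-bit quantization and ~20% overhead figures are assumptions, not exact numbers:

```python
def model_ram_gb(params_billion: float, bits_per_weight: float = 4.0, overhead: float = 1.2) -> float:
    # Weights dominate the footprint: parameter count times bits per weight,
    # plus roughly 20% headroom for the KV cache and runtime buffers (assumed).
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 4-bit-quantized 7B model fits comfortably in 8 GB of RAM;
# a 70B one is why the behemoth models need a beefy machine.
print(round(model_ram_gb(7), 1))   # 4.2
print(round(model_ram_gb(70), 1))  # 42.0
```

            So the distilled 7B-class models run fine on an ordinary machine even without a GPU, just more slowly.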