Big tech has made some big claims about greenhouse gas emissions in recent years. But as the rise of artificial intelligence creates ever bigger energy demands, it’s getting hard for the industry to hide the true costs of the data centers powering the tech revolution.

According to a Guardian analysis, from 2020 to 2022 the real emissions from the “in-house” or company-owned data centers of Google, Microsoft, Meta and Apple are likely about 662% – or 7.62 times – higher than officially reported.
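The two figures are the same claim stated two ways. A quick sanity check of the arithmetic, with a placeholder reported value (the 100.0 is illustrative, not a figure from the analysis):

```python
# "662% higher" means the true figure is the reported figure plus another
# 6.62x of it -- i.e. 7.62 times the reported number.
reported = 100.0               # placeholder reported emissions (any unit)
pct_higher = 662               # the Guardian's estimate
actual = reported * (1 + pct_higher / 100)
print(round(actual / reported, 2))  # 7.62
```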

Amazon is the largest emitter of the big five tech companies by a mile – the emissions of the second-largest emitter, Apple, were less than half of Amazon’s in 2022. However, Amazon has been kept out of the calculation above because its differing business model makes it difficult to isolate data center-specific emissions figures for the company.

As energy demands for these data centers grow, many are worried that carbon emissions will, too. The International Energy Agency stated that data centers already accounted for 1% to 1.5% of global electricity consumption in 2022 – and that was before the AI boom began with ChatGPT’s launch at the end of that year.

AI is far more energy-intensive on data centers than typical cloud-based applications. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search, and data center power demand will grow 160% by 2030. Goldman competitor Morgan Stanley’s research has made similar findings, projecting data center emissions globally to accumulate to 2.5bn metric tons of CO2 equivalent by 2030.
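As a back-of-the-envelope check on both claims, using the per-query figures commonly attributed to the Goldman report (the 0.3 Wh and 2.9 Wh values are assumptions, not numbers from this article):

```python
# Per-query energy figures commonly attributed to the Goldman Sachs report;
# treat them as assumptions rather than figures from this article.
google_search_wh = 0.3         # Wh per Google search
chatgpt_query_wh = 2.9         # Wh per ChatGPT query
print(round(chatgpt_query_wh / google_search_wh, 1))  # 9.7 -- "nearly 10 times"

# "Grow 160% by 2030" means 2030 demand is 2.6x today's, not 1.6x.
growth_pct = 160
print(1 + growth_pct / 100)    # 2.6
```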

    • @[email protected]
      link
      fedilink
      23 months ago

      Approaches in good faith*

      Why do you think it’s so crazy to create an AI god, as you put it (AGI)? Can I ask your background on that topic? I’m a dumb new software engineer and I use LLMs. They fascinate me with their potential for mass-accessible education and for condensing huge amounts of information into new insight.

      I know some tech companies are investing a lot in energy solutions because the energy problem seems very real. I agree the acceleration of resource usage seems pretty crazy. Are there any top minds in the industry raising similar concerns? Between uneducated memeing doomers on here and self-interested companies using it, it’s hard to talk objectively about AI. Isn’t it kind of like… it’s here and not going away, and we should stay on the cutting edge so bad actors can’t capitalize on AI (fraud has been rampant since AI took off).

      • @[email protected]
        link
        fedilink
        English
        6
        edit-2
        3 months ago

        For me it’s because I’m not convinced LLMs are really a stepping stone to any actual AI. They don’t have educational applications, imo, because there isn’t any way they can separate truth from fiction. They don’t understand the words they output; they’re just predictive text generators on a huge scale. This isn’t something that can change with better tech, either; it’s baked into the very concept of an LLM. Worse, when they’re wrong there’s no way to tell without already knowing the answer to the question you’re asking. They’re literally just monkeys with typewriters. This is an extremely good article about the kinds of problems I’m talking about.

        • @[email protected]
          link
          fedilink
          13 months ago

          Thanks. I speed-read it and hope it’s okay if I raise some quick thoughts.

          I thought it was interesting how it mentioned LLMs aren’t a mind formed in nature. I’d offer a dumb conjecture that AGI, while a mind, might still need an LLM as a component to actually handle the amount of data a society produces. Like you said, LLMs are useful if you know the answer, or at least suspect when to revisit a result. Maybe we’re missing the biggest piece of AGI, but handling data is really important, and that still benefits us, right? I think we’ll need more than a mind from our local nature to create god.

          I’m a pretty skeptical person. When I used ChatGPT I was pretty blown away, and I wouldn’t say I was leaning into the idea that it was sentient. I just saw an incredible new tool, and through using it I now understand the pitfalls and can get awesome results that would never have been achieved with googling in the amount of time I spent. Almost all of the heavy lifting I have it do I immediately verify through testing, and it’s correct often enough to realize huge gains over googling, my local library, etc.

          I think the criticisms of LLMs and their capability aren’t inaccurate, but maybe short-sighted? I think criticisms should currently focus on performance in how we’re actually using them, not on how laypeople might imagine using something they don’t understand. Ultimately, any use case should be heavily tested and should perform more accurately than its human counterpart (where we’re talking about replacing humans, anyway). If we don’t find that the gains from those applications justify the power use, or whatever, then we’re always capable of recognizing that.

          I think it’s 100% valid to push back against idealized predictions, but I also think shit’s gonna get crazy. I think there’s a lot to be gained, and I question why LLMs can’t be a stepping stone to greater computing milestones even if LLMs themselves aren’t a component of AGI in the end.

          What I’m trying to be convinced of is that the criticisms aren’t as overblown as the hype.

  • @[email protected]
    link
    fedilink
    43 months ago

    Meta’s emissions were 3000x higher than they reported?! What the heck are they doing over there

  • @[email protected]
    link
    fedilink
    33 months ago

    RECs and similar market-based methods for Scope 2 accounting are complete bullshit and need to be removed from the GHG protocol.

    They’re not driving a transition to renewables; they’re just giving companies permission to claim their emissions are lower without actually changing anything.

    • @[email protected]
      link
      fedilink
      23 months ago

      These are certificates that a company purchases to show it is buying renewable energy-generated electricity to match a portion of its electricity consumption – the catch, though, is that the renewable energy in question doesn’t need to be consumed by a company’s facilities. Rather, the site of production can be anywhere from one town over to an ocean away.

      If I understand this correctly, a tech firm with a data centre in Melbourne could buy RECs from Helsinki, pocket the certificate, and on-sell the energy to someone who needed it in Helsinki without the certificate?

      • @[email protected]
        link
        fedilink
        13 months ago

        Yea, basically. Creative accounting abuses of RECs are rampant. There’s no tangible product or service delivered when you buy a REC, so there’s nothing stopping a bad actor from selling the same “REC” to more than one buyer.

        But more importantly, RECs don’t work to reduce GHG emissions even when they’re purchased and sold in good faith. RECs don’t change anything; that’s the problem. They don’t reduce electricity usage or change the grid mix. All RECs do is give a company the ability to claim that it was magically someone else’s electricity that resulted in fossil fuels being burned, not theirs. Companies that buy RECs are paying to shift the blame onto companies that didn’t.

        Back when solar and wind were more expensive than fossil fuels, it may have made sense to offer companies the option of paying extra for “green” power that otherwise wouldn’t have made financial sense. But now that wind and solar are cheaper than coal and natural gas, utility providers will buy all available green power regardless of RECs.

        The bottleneck to building more renewable power isn’t money. Companies paying for RECs aren’t making it happen any faster; they’re just greenwashing their ESG reporting.
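The accounting gap the thread describes is the difference between the GHG Protocol’s two Scope 2 methods: location-based and market-based. A minimal sketch with made-up numbers (the consumption, grid factor, and REC figures below are illustrative assumptions, not data from the article):

```python
# Made-up illustrative numbers; the point is the accounting rule, not the values.
consumption_mwh = 1000.0   # electricity the facility actually consumed
grid_factor = 0.5          # tCO2e per MWh of the local grid mix (assumed)
recs_mwh = 1000.0          # RECs purchased, possibly generated an ocean away

# Location-based: emissions follow the grid the facility physically draws from.
location_based = consumption_mwh * grid_factor                    # 500.0 tCO2e

# Market-based: every MWh "matched" by a REC counts as zero-emission, even
# though the facility's physical electricity mix hasn't changed at all.
market_based = max(consumption_mwh - recs_mwh, 0) * grid_factor   # 0.0 tCO2e

print(location_based, market_based)
```

With enough RECs, the market-based figure a company reports drops to zero while the location-based figure, and the actual fossil generation behind it, is unchanged; that gap is what the Guardian analysis above is measuring.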