The new AI Index report highlights the vast amount of energy that AIs like ChatGPT require, though there may be a positive side, too
A small but important part of the AI Index Report for 2023¹ points to the growing concern about the energy consumption required for AI training.
Spoiler alert: it's quite a lot.
There's no standard benchmark for tracking the carbon intensity of AI systems, so the report focuses on research from a recent paper by Luccioni et al., 2022², which records the energy requirements of a number of large language models (LLMs), including ChatGPT.
The following table shows the energy requirements for training four different AI models and the CO2 emissions associated with them.
The data contains a number of measurements, but the bottom line is the power consumption and CO2 emissions, which I have summarised in the charts below.
There is quite a difference between the various models and, as you can see, OpenAI's GPT-3 comes top with a consumption of over 1,200 megawatt-hours. That's about as much electricity as 120 US homes would consume in a year, based on consumption figures from the U.S. Energy Information Administration³. That certainly seems like a lot of energy.
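As a quick back-of-the-envelope check of that comparison (the exact figures are assumptions: roughly 1,287 MWh for GPT-3 training from the Luccioni et al. paper, and an EIA average of roughly 10,600 kWh per US home per year):

```python
# Rough check: GPT-3 training energy vs average US household consumption.
gpt3_training_mwh = 1287   # approximate GPT-3 training energy (Luccioni et al.)
home_annual_mwh = 10.6     # approximate EIA average, ~10,600 kWh per home per year

homes_per_year = gpt3_training_mwh / home_annual_mwh
print(f"Equivalent to the annual use of about {homes_per_year:.0f} US homes")
# -> Equivalent to the annual use of about 121 US homes
```

Which is where the "about 120 homes" figure comes from.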
The chart below illustrates the CO2 emissions, which follow a similar pattern.
Luccioni, the paper's lead author, is a researcher at Hugging Face Inc., and the work is mainly concerned with BLOOM, her company's alternative to ChatGPT. The figures for other models are approximate and based on what public information is available (Bloomberg reports Luccioni saying that nothing is really known about ChatGPT and that it could just be "…three raccoons in a trench coat." Does that mean GPT-4 will be four raccoons?).
CO2 emissions for training ChatGPT are equivalent to around 500 flights from New York to San Francisco
The AI Index Report makes some comparisons with other energy-intensive activities and their CO2 emissions (see chart, below). It finds, for example, that the CO2 emissions generated in training ChatGPT are equivalent to one passenger taking a flight from New York to San Francisco around 500 times! Or the total energy consumption of a single American over 28 years!
Unsurprisingly, the single air passenger doesn't produce zero emissions, as it might appear from the chart above (the figure is nearly 1 tonne). You can see the actual numbers more clearly in this table:
But it's not all bad news.
AI may reduce energy consumption
According to Bloomberg, while AI models are getting larger (and presumably more energy-intensive), the companies creating them are working on improving efficiency. Microsoft, Google and Amazon, the cloud companies that host much of the work, are all aiming for carbon-negative or carbon-neutral operations. That is, of course, highly desirable.
Also, while training AI systems is energy-intensive, recent research shows that AI systems can also be used to optimize energy consumption. A paper from DeepMind⁴ released in 2022 details the results of a 2021 experiment in which it trained an AI called BCOOLER to optimize cooling in Google's data centres.
The graph above shows the energy-saving results from one BCOOLER experiment. After three months, an energy saving of roughly 12.7% was achieved.
Even if carbon neutrality is achieved, using AI to increase the efficiency of these centres will also make them cheaper to run. Perhaps we should be thinking about applying AI to other energy-intensive industries, too.
I doubt that we are currently able to know exactly what the eventual toll on the environment will be. LLMs like ChatGPT are not going away, and so the energy that needs to be spent in training them is undoubtedly going to be spent. On the other hand, it's not the case that people are going to stop flying from NY to SF, heating their homes or using their cars.
But we should try to put some of this somewhat shocking data into perspective. While a ChatGPT training session might use as much energy as one American does in 28 years (which sounds like an awful lot), it is also true that 330 million Americans, the population of the USA, emit around 10 million times more CO2 than a single ChatGPT training run⁵.
And there appear to be around 20 flights a day from New York to San Francisco; if we say that each flight carries 150 passengers, that works out at over 1 million tonnes of CO2 emissions per year, more than 2,000 ChatGPTs⁵.
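These rough calculations can be sketched out directly. The figures are all assumptions from the sources above: roughly 502 tonnes of CO2 for a ChatGPT-scale training run (the GPT-3 figure from Luccioni et al.), 18 tonnes of CO2 per American per year, and about 1 tonne per NY-to-SF passenger flight.

```python
# Rough-calculation sketch of the two comparisons above (all figures approximate).
training_run_tonnes = 502  # ~CO2 for one GPT-3-scale training run (Luccioni et al.)

# 330 million Americans at ~18 tonnes of CO2 each per year
us_population = 330_000_000
us_total_tonnes = us_population * 18            # ~5.94 billion tonnes
print(us_total_tonnes / training_run_tonnes)    # ~11.8 million training runs

# ~20 NY-SF flights a day, ~150 passengers each, ~1 tonne per passenger
flight_tonnes_per_year = 20 * 150 * 1 * 365     # ~1.1 million tonnes per year
print(flight_tonnes_per_year / training_run_tonnes)  # ~2,180 training runs
```

So "around 10 million times" and "more than 2,000 ChatGPTs" are, if anything, slightly conservative readings of these assumed figures.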
As single entities, ChatGPT and its like clearly use a lot of energy (and thus, at the moment at least, produce a lot of CO2 emissions), but compared with the energy consumption and CO2 emissions from other human activity, are they really very significant (there are, after all, a lot more humans than LLMs)?
Also, it has to be good news that the big cloud hosting companies are aiming to achieve carbon neutrality, which, if achieved, will reduce net CO2 emissions to zero. So while energy use might remain high, the aim is to make its environmental impact neutral.
Moreover, AI can be used to mitigate some of the energy use in data centres. Perhaps similar technology could be used in airlines and other energy-intensive industries.
The bottom line, however, is that we are all producing more CO2 than we should, so any additional energy use that is not produced from renewables is moving in the wrong direction.
Thanks for reading; I hope you found this useful. If you would like to see more of my work, please visit my website.
You can also get updates by subscribing to my occasional, free newsletter on Substack.
If you are not a Medium member, you can sign up using my referral link and read any Medium content for only $5 per month.
References
- The AI Index 2023 Annual Report
Nestor Maslej, Loredana Fattorini, Erik Brynjolfsson, John Etchemendy, Katrina Ligett, Terah Lyons, James Manyika, Helen Ngo, Juan Carlos Niebles, Vanessa Parli, Yoav Shoham, Russell Wald, Jack Clark, and Raymond Perrault, “The AI Index 2023 Annual Report,” AI Index Steering Committee, Institute for Human-Centered AI, Stanford University, Stanford, CA, April 2023.
The AI Index 2023 Annual Report by Stanford University is licensed under Attribution-NoDerivatives 4.0 International.
You can find the complete report on the AI Index page at Stanford University.
5. CO2 emissions from other sources (these are rough calculations):
330 million Americans emit 18 tonnes of CO2 each annually; that's 330m x 18, around 5,940m tonnes of CO2, or roughly 10 million ChatGPTs.
Approx. 20 flights daily, NY to SF, with around 150 passengers on board produce 20 x 150, or 3,000 tonnes of CO2 per day. That's 3,000 x 365, about 1.1 million tonnes of CO2 per year, or around 2,000 ChatGPTs.