As powerful generative AI models are rapidly developed and deployed, the environmental problems they create are drawing growing attention, including a sharp rise in electricity demand and heavy water consumption.
The Resource Intensity and Environmental Cost of Generative AI
Generative AI has attracted widespread attention for its potentially enormous benefits, from boosting work efficiency to accelerating scientific research, and its range of applications is genuinely exciting. Yet the environmental impact of the technology's rapid growth is difficult even to quantify, let alone effectively mitigate. Training a generative AI model such as OpenAI's GPT-4 demands enormous computing power, and these models often contain billions of parameters. This not only consumes large amounts of electricity, increasing carbon dioxide emissions, but also strains the operation of the power grid.
Moreover, even after the model is developed, deploying it for practical applications (such as allowing millions of people to use generative AI tools daily) and fine-tuning the model to improve performance will continue to consume a significant amount of energy.
In addition to electricity demand, the hardware required for training, deploying, and fine-tuning generative AI models also requires a large amount of water resources for cooling. This puts pressure on local water supply systems and may disrupt surrounding ecosystems. At the same time, the increasing popularity of generative AI drives the demand for high-performance computing hardware, whose manufacturing and transportation processes bring indirect environmental impacts.
Elsa A. Olivetti, a professor in the Department of Materials Science and Engineering at MIT and head of the MIT Climate Project's decarbonization task, stated that the environmental impact of generative AI goes beyond the electricity drawn when a computer is plugged in; it is a broader, systemic issue whose consequences may persist as a result of our actions.
Professor Olivetti is a senior author of a paper titled “The Climate and Sustainability Impacts of Generative AI” published in 2024. The paper, co-authored by several scholars from MIT, aims to explore both the positive and negative transformations generative AI brings to society.
High-Energy Consumption Data Centers
The electricity demand of data centers is one of the main factors contributing to the environmental impact of generative AI. These data centers are used to train and run deep learning models, providing the technical support for popular tools like ChatGPT and DALL-E.
A data center is a temperature-controlled building housing servers, data storage devices, and network equipment. For example, Amazon has over 100 data centers worldwide, each with about 50,000 servers supporting cloud computing services.
Although the history of data centers dates back to the 1940s (the first data center supporting the general-purpose digital computer ENIAC was built at the University of Pennsylvania in 1945), the rise of generative AI has significantly accelerated the pace of data center construction.
Noman Bashir, a researcher at the MIT Climate and Sustainability Consortium (MCSC) and a postdoctoral researcher at the Computer Science and Artificial Intelligence Laboratory (CSAIL), pointed out that what makes generative AI unique is its high power density. A generative AI training cluster may consume 7 to 8 times the energy of typical computational workloads.
Researchers estimate that the electricity demand of North American data centers has increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, with part of the growth attributed to the demand for generative AI. Globally, data centers consumed 460 terawatt-hours (TWh) of electricity in 2022, equivalent to the 11th largest electricity-consuming country in the world, ranking between Saudi Arabia (371 TWh) and France (463 TWh).
It is expected that by 2026, the electricity consumption of data centers will approach 1,050 TWh, which would elevate them to the fifth position in global electricity consumption rankings, just behind Japan and Russia.
Although not all computational work in data centers involves generative AI, this technology is one of the main drivers of increased energy demand.
Bashir stated that the demand for new data centers cannot be met sustainably. The speed at which companies are building data centers means that most electricity must come from fossil fuel power plants.
The electricity demand of training a large model is difficult to quantify precisely. Taking OpenAI's GPT-3 as an example, a 2021 study estimated that the training process alone consumed 1,287 megawatt-hours of electricity (enough to power roughly 120 average American households for a year) and produced about 552 tons of carbon dioxide.
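As a rough sanity check on the household comparison above, one can divide the estimated training energy by average annual household consumption. The household figure below (~10,600 kWh per year, a commonly cited U.S. average) is an assumption for this sketch, not a number from the article:

```python
# Back-of-envelope check of the "120 households for a year" comparison.
TRAINING_MWH = 1_287              # estimated GPT-3 training consumption (2021 study)
HOUSEHOLD_KWH_PER_YEAR = 10_600   # ASSUMED average annual U.S. household usage

training_kwh = TRAINING_MWH * 1_000
households_for_a_year = training_kwh / HOUSEHOLD_KWH_PER_YEAR
print(f"~{households_for_a_year:.0f} households for one year")  # ~121, close to the quoted 120
```

Under this assumed household figure the arithmetic lands near the article's "about 120 households", which is where such comparisons typically come from.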
Bashir pointed out that a problem unique to generative AI is how sharply its energy consumption fluctuates during training. These swings force grid operators to take measures to absorb them, often by relying on diesel generators.
Ongoing Impact During Inference Phase
Even after a generative AI model is trained, its energy demand continues to exist. For example, each time a user asks a question to ChatGPT, the computing hardware executing these operations consumes energy. It is estimated that each query to ChatGPT consumes about five times the electricity of a simple web search.
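The per-query comparison can be turned into a rough illustration of inference-phase energy at scale. Both the per-search energy (~0.3 Wh, a commonly cited estimate for a web search) and the daily query volume below are assumptions for this sketch; only the 5× multiplier comes from the article:

```python
# Illustrative only: scale an ASSUMED per-search energy by the article's 5x factor.
WEB_SEARCH_WH = 0.3            # assumed energy per web search (commonly cited estimate)
CHATGPT_MULTIPLIER = 5         # per the estimate quoted above
QUERIES_PER_DAY = 100_000_000  # assumed daily query volume, for illustration

chatgpt_query_wh = WEB_SEARCH_WH * CHATGPT_MULTIPLIER      # 1.5 Wh per query
daily_kwh = chatgpt_query_wh * QUERIES_PER_DAY / 1_000     # Wh -> kWh
print(f"~{daily_kwh:,.0f} kWh per day")  # ~150,000 kWh/day under these assumptions
```

The point of the sketch is not the exact numbers but the scaling: a small per-query cost multiplied by hundreds of millions of daily queries becomes a grid-scale load.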
However, ordinary users are often unaware of these environmental costs and have no motivation to reduce their use of generative AI.
The inference calculations of generative AI models are becoming a major source of energy consumption. As generative AI becomes more prevalent in various applications and the scale and complexity of models continue to increase, the electricity required for inference will continue to grow.
Moreover, the “lifecycle” of generative AI models is often relatively short. Due to the rising demand for new AI applications, companies release new models every few weeks, causing the energy consumed during the training of previous versions to be wasted. New models typically have more parameters and higher training energy consumption.
Environmental Impact of Cooling Water and Hardware Manufacturing
In addition to electricity consumption, the cooling water used by data centers also has environmental impacts. It is estimated that for every kilowatt-hour of energy consumed, data centers require 2 liters of water for cooling.
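Combining the 2 liters per kilowatt-hour figure with the GPT-3 training estimate quoted earlier gives a sense of scale. This combination is our illustration, not a figure stated in the article:

```python
# Rough scale estimate: cooling water implied by the article's two figures.
WATER_L_PER_KWH = 2        # article's estimate: liters of cooling water per kWh
TRAINING_KWH = 1_287_000   # GPT-3 training estimate quoted earlier (1,287 MWh)

cooling_water_liters = WATER_L_PER_KWH * TRAINING_KWH
print(f"~{cooling_water_liters / 1e6:.1f} million liters of cooling water")  # ~2.6 million liters
```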
Bashir stated that the hardware in data centers is not truly “in the cloud,” and its water usage has direct and indirect impacts on biodiversity.
Furthermore, the manufacturing and transportation of high-performance hardware (such as GPUs) used for generative AI also come with environmental costs. The manufacturing process of GPUs is complex, consuming more energy than ordinary CPUs, while also involving issues such as carbon emissions and raw material extraction.
According to market research firm TechInsights, the three major GPU manufacturers (NVIDIA, AMD, and Intel) delivered 3.85 million GPUs to data centers in 2023, compared to 2.67 million in 2022. This number is expected to grow significantly in 2024.
Moving Towards Sustainable Generative AI
Although the current trajectory of generative AI development is unsustainable, it is still possible to promote its responsible development through a comprehensive assessment of its environmental and social costs and a thorough consideration of its potential benefits.
Professor Olivetti concluded that the rapid advancement of generative AI technology exceeds our capacity to understand its trade-offs and impacts, necessitating a more systematic and comprehensive approach to analyze its environmental impact.