Artificial intelligence (AI) is driving progress in medicine, transportation, industry and many other fields.
But beneath the breakthroughs lies a growing tension: the energy consumption of AI systems is rising rapidly, and this trend may conflict with global climate and sustainability goals.
This article examines how AI’s energy needs diverge from global environmental goals, what factors drive this divergence, what the research reveals, and what might be done to reconcile AI development with decarbonization.
AI’s enormous energy footprint
One of the most striking findings about artificial intelligence concerns the energy needs of data centers. In 2022, global electricity consumption by data centers was already approximately 460 terawatt-hours, which would place data centers among the world’s largest consumers of electricity if they were a single country.
Generative AI workloads such as large language models and image and speech generation account for much of the recent increase in data center energy consumption.
It is estimated that training a model like GPT-3 alone used over 1,200 megawatt-hours of electricity and generated hundreds of tons of CO₂.
The costs do not stem from training alone; each inference (using a model to predict or generate results) has an energy cost, and when multiplied by millions or billions of uses per day, the sum becomes significant.
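The scale of the inference side can be made concrete with a back-of-envelope calculation. The ~1,200 MWh training figure is the estimate cited above; the per-query energy and daily query volume below are purely illustrative assumptions, not measured values:

```python
# Back-of-envelope sketch: how quickly cumulative inference energy
# overtakes a one-off training cost. Only TRAINING_MWH comes from the
# article; the other two figures are hypothetical assumptions.

TRAINING_MWH = 1_200            # one-off cost of training a GPT-3-class model
WH_PER_QUERY = 0.3              # assumed energy per inference, in watt-hours
QUERIES_PER_DAY = 100_000_000   # assumed daily query volume

daily_inference_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000  # Wh -> MWh
days_to_match_training = TRAINING_MWH / daily_inference_mwh

print(f"Daily inference energy: {daily_inference_mwh:.0f} MWh")
print(f"Inference matches the training cost after {days_to_match_training:.0f} days")
```

Under these assumptions, serving the model for about six weeks consumes as much electricity as training it did, which is why inference volume, not just training, dominates the long-run footprint.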
Why this conflicts with climate goals
Climate goals are generally based on reducing greenhouse gas emissions, shifting energy systems from fossil fuels to clean sources, limiting global warming (for instance to 1.5-2 degrees Celsius), and preserving or restoring environmental quality (including water and air).
As AI’s energy demands increase, they can undermine these goals in several ways. In many places, electricity is still generated from coal, natural gas or other carbon-intensive sources. When AI drives up demand for electricity, it results in more emissions in the absence of clean energy.
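A short sketch of why the energy mix matters: the same electricity demand produces very different emissions depending on the carbon intensity of the grid. The intensity figures below are approximate lifecycle medians of the kind reported by the IPCC (in tonnes of CO₂ per GWh, equivalent to gCO₂/kWh); the demand figure is hypothetical.

```python
# Same demand, very different emissions depending on the power source.
# Intensity values are approximate lifecycle medians; demand is assumed.

DEMAND_GWH = 1_000  # assumed annual AI-related electricity demand in one region

INTENSITY_T_PER_GWH = {"coal": 820, "natural gas": 490, "wind": 11}

emissions = {source: DEMAND_GWH * t for source, t in INTENSITY_T_PER_GWH.items()}
for source, tonnes in emissions.items():
    print(f"{source}: {tonnes:,} tonnes of CO2 per year")
```

With these figures, the same workload emits roughly 75 times more CO₂ on a coal-heavy grid than on wind power, which is why where data centers draw their electricity matters as much as how much they draw.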
Even in regions that favor renewable energy, rising demand could strain grids, force dependence on backup power generation from fossil fuels, or raise electricity costs, thereby slowing down the transition in other sectors.
Research suggests that data center energy demand could almost double in some regions, and some projections indicate that artificial intelligence could account for up to half of data center electricity consumption.
In addition to energy, the environmental footprint of AI includes water consumption (for cooling and manufacturing), hardware production and potential electronic waste. These, too, place a strain on environmental systems and resources.
AI also shows some promise
Despite these worrying trends, not all research concludes that artificial intelligence is fundamentally incompatible with environmental goals.
In some settings, AI can help improve efficiency, reduce waste, optimize routes and logistics, and improve energy use in manufacturing, smart buildings and other systems.
Some projects show reductions in emissions or energy consumption when AI is used to predict, control or optimize.
For example, studies in China show that the application of artificial intelligence in non-polluting industries, or in state-owned enterprises pursuing energy goals, has resulted in modest but measurable reductions in energy consumption.
However, even where AI delivers efficiencies, overall demand growth could still outpace these savings unless the efficiency gains are very large or energy sources are clean.
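The rebound problem described above can be illustrated with simple arithmetic: an efficiency gain lowers total energy use only if it outweighs demand growth. All figures here are hypothetical assumptions.

```python
# Illustrative rebound-effect arithmetic: efficiency gains vs demand growth.
# All three inputs are assumed values for illustration only.

baseline_twh = 100.0     # assumed current AI-related electricity use
demand_growth = 0.60     # assumed 60% growth in AI workload
efficiency_gain = 0.20   # assumed 20% less energy per unit of work

new_twh = baseline_twh * (1 + demand_growth) * (1 - efficiency_gain)
print(f"Energy use after growth and efficiency gains: {new_twh:.0f} TWh")
```

With these assumptions, total consumption still rises from 100 to roughly 128 TWh: the efficiency gain would need to cut energy per unit of work by more than 37.5% just to hold consumption flat against 60% workload growth.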
What needs to change
To reconcile the development of artificial intelligence with global environmental goals, we need action on multiple fronts.
First, there must be greater transparency: full disclosure of energy use, emissions, water use and other resource impacts, especially throughout the lifecycle of AI systems (hardware production, deployment, retirement).
Second, energy sources for AI infrastructure must shift decisively to renewables; if AI systems are powered mainly by fossil fuels, the environmental damage will continue.
Third, AI research can prioritize efficiency: not only pushing for larger models or more data, but also for smarter model selection, efficient inference, and reduced waste. The “little is enough” approach holds promise.
Fourth, regulatory or policy incentives may be required: governments could set energy efficiency standards for data centers, provide incentives for green AI infrastructure, or limit energy use by AI workloads in regions where grid capacity or environmental costs are particularly high.