How data centers could delay climate progress

The rise of artificial intelligence has been nothing short of explosive. From the chatbots that answer our questions to the recommendation engines behind shopping apps and social media feeds, AI is quickly becoming woven into daily life. But behind every seamless digital interaction lies something far less visible: vast data centers humming around the clock, consuming enormous amounts of electricity.

As AI systems grow more powerful and widely used, so does the energy required to run them. And that surge in electricity demand could complicate efforts to slow climate change.

Nick Muller of Carnegie Mellon University has been closely examining this issue. He warns that the scale of new computing infrastructure expected in the United States over the next five to ten years will require substantial new power generation. In other words, keeping up with AI’s growth will not just mean building more servers—it will mean producing much more electricity.

The central question is: Where will that electricity come from?

In an ideal world, it would come from renewable sources like wind and solar. But the reality of energy markets and infrastructure timelines means that fossil fuels could play a significant role. Some developers may turn to newly constructed natural gas plants. In other cases, older coal plants that had been retired could be restarted to meet rising demand.

That possibility alarms many climate and public health experts. Burning natural gas and coal releases carbon dioxide, the primary greenhouse gas driving global warming. It also emits pollutants such as nitrogen oxides, sulfur dioxide, and fine particulate matter—substances linked to asthma, heart disease, and premature death. Communities living near power plants often bear the brunt of these impacts.

The irony is hard to miss. AI is often marketed as a tool to improve efficiency, optimize supply chains, and even help model climate solutions. Yet the infrastructure supporting it could undermine climate progress if powered by fossil fuels.

Muller suggests one way to reduce the damage: invest in technologies that capture carbon dioxide and other pollutants before they escape into the atmosphere. Carbon capture systems can be installed at smokestacks, trapping a portion of emissions and storing them underground. Advanced pollution controls can also reduce harmful air contaminants.

In theory, such technologies could allow fossil fuel plants to operate with a smaller environmental footprint. In practice, however, carbon capture remains expensive and technically challenging. It does not eliminate all emissions, and critics argue that it prolongs reliance on fossil fuels at a time when rapid decarbonization is urgently needed.

Opponents of the fossil-heavy approach believe the solution is simpler, though not necessarily easier: power data centers with clean energy from the start.

That means siting new AI facilities in regions rich in renewable resources. West Texas and the Great Plains, for example, are known for abundant wind and sunshine. These areas already host large wind farms and solar arrays, and in many cases, renewable electricity there is among the cheapest in the country. Building data centers nearby could align AI expansion with the ongoing shift toward cleaner energy.

Location, in this sense, becomes a climate decision. A data center built in a region dominated by coal-fired power has a very different environmental impact than one connected to a grid heavy in wind and solar. The same computing task—training a large language model or serving millions of search queries—can produce vastly different emissions depending on the energy mix behind it.
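The point that identical workloads can have very different footprints comes down to simple arithmetic: emissions equal energy consumed times the grid's carbon intensity. A minimal sketch, using hypothetical round numbers rather than measurements of any real grid or training run:

```python
def emissions_tonnes(energy_mwh: float, intensity_kg_per_mwh: float) -> float:
    """CO2 emissions, in metric tons, for a job consuming energy_mwh
    on a grid with the given carbon intensity (kg CO2 per MWh)."""
    return energy_mwh * intensity_kg_per_mwh / 1_000  # kg -> metric tons

# Hypothetical energy use for one large training run.
TRAINING_RUN_MWH = 1_000

# Illustrative carbon intensities, not figures for any specific grid.
GRIDS = {
    "coal-heavy grid": 900,
    "natural-gas-heavy grid": 450,
    "wind/solar-heavy grid": 50,
}

for name, intensity in GRIDS.items():
    print(f"{name}: {emissions_tonnes(TRAINING_RUN_MWH, intensity):,.0f} t CO2")
```

Under these assumed numbers, the same training run emits roughly eighteen times more carbon on a coal-heavy grid than on one dominated by wind and solar, which is why siting decisions carry so much weight.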

There is also a timing challenge. Renewable projects can take years to permit and construct, and new transmission lines face bottlenecks of their own, from regulatory hurdles to local opposition. If AI demand surges faster than clean energy infrastructure can expand, utilities may default to whatever capacity is quickest to deploy—often natural gas.

For the general public, this debate can feel abstract. Few people think about the electricity behind a chatbot’s answer or a streaming platform’s recommendation. Yet the cumulative effect of billions of digital interactions is tangible. Data centers already account for a growing share of U.S. electricity consumption, and AI is expected to accelerate that trend.
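The "cumulative effect" is easy to see with back-of-the-envelope arithmetic: a tiny per-query energy cost, multiplied by billions of daily interactions, becomes a utility-scale load. The figures below are assumptions chosen purely for illustration, not measured values:

```python
# Hypothetical inputs for a rough scale estimate.
WH_PER_QUERY = 0.3               # assumed energy per AI query, in watt-hours
QUERIES_PER_DAY = 2_000_000_000  # assumed daily query volume

# Convert watt-hours to megawatt-hours (1 MWh = 1,000,000 Wh).
daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000
print(f"{daily_mwh:,.0f} MWh per day")
```

With these assumed inputs, queries that cost a fraction of a watt-hour each still sum to hundreds of megawatt-hours every day—electricity that has to be generated somewhere, from some fuel.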

Companies developing AI technologies have considerable influence over how this story unfolds. Long-term power purchase agreements for renewable energy, investments in battery storage, and partnerships with utilities to expand clean generation can all steer growth in a lower-carbon direction. Transparent reporting on energy use and emissions can also help investors and consumers evaluate corporate commitments.

At the same time, policymakers play a critical role. Incentives for clean energy, stricter air pollution standards, and support for grid modernization can shape the economics of power generation. If fossil fuel plants are allowed to operate without accounting for their full environmental and health costs, they will often appear cheaper than they truly are.

The tension surrounding data centers reflects a broader crossroads in climate policy. Technological progress does not automatically align with environmental sustainability. Without deliberate planning, innovation in one domain can create setbacks in another.

Artificial intelligence promises breakthroughs in medicine, transportation, education, and climate science itself. But realizing those benefits without worsening air pollution and global warming requires thoughtful choices about energy. Where data centers are built, how they are powered, and what policies guide their development will determine whether AI becomes a climate ally—or an obstacle.

As the digital economy expands, the invisible infrastructure supporting it is becoming a defining environmental issue. The servers may be out of sight, but their energy sources—and their emissions—are very much part of the climate equation.