Quantifying The Environmental Impact Of AI Data Centers
As the tech industry's so-called "hyperscalers" have invested vast sums in expanding the data centers that power the latest generation of AI models, the associated environmental costs have garnered fresh attention.
Thus far, it has not proved easy to truly understand the environmental impact data centers have, not least because the industry is largely left to self-report. A recent study from Cornell aims to do a better job.
Environmental impact
The researchers used data analytics and a dollop of AI to develop what they believe is a state-by-state analysis of data centers' environmental impact. The study found that the current rate of growth in data center capacity would pump up to 44 million metric tons of CO2 into the atmosphere by 2030. That's equivalent to adding around 10 million cars to America's roads.
The situation is no better when it comes to water, with growth in AI set to drain around 1,125 million cubic meters of water per year. That's equivalent to the annual household water usage of 10 million Americans. Perhaps unsurprisingly, this makes it practically impossible for the United States to meet its net-zero targets.
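The two equivalence figures above can be roughly sanity-checked. The conversion factors below are not from the study: they assume the EPA's estimate of roughly 4.6 metric tons of CO2 per typical passenger vehicle per year and a USGS-style estimate of around 82 gallons of household water use per person per day.

```python
# Rough sanity check of the article's equivalence figures.
# Conversion factors are assumptions (EPA / USGS estimates), not study values.

CO2_TONS = 44e6               # projected annual CO2 emissions by 2030 (study)
TONS_PER_CAR = 4.6            # metric tons CO2 per car per year (EPA estimate)
cars_equivalent = CO2_TONS / TONS_PER_CAR
print(f"~{cars_equivalent / 1e6:.1f} million cars")      # roughly 9.6 million

WATER_M3 = 1_125e6            # projected annual water draw, cubic meters (study)
GAL_PER_PERSON_DAY = 82       # household water use per person (USGS estimate)
M3_PER_PERSON_YEAR = GAL_PER_PERSON_DAY * 365 * 0.003785  # gallons -> m3
people_equivalent = WATER_M3 / M3_PER_PERSON_YEAR
print(f"~{people_equivalent / 1e6:.1f} million people")  # roughly 9.9 million
```

Both back-of-the-envelope results land close to the "around 10 million" figures quoted in the study.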
Are things irreparable? The researchers don’t think so, and outline a roadmap that could make the AI transformation more sustainable. This includes things like faster grid decarbonization and smart siting. If these steps are taken, the researchers believe that the carbon emissions could drop by 73% and water usage by 86%.
“Artificial intelligence is changing every sector of society, but its rapid growth comes with a real footprint in energy, water and carbon,” the researchers explain.
“Our study is built to answer a simple question: Given the magnitude of the AI computing boom, what environmental trajectory will it take? And more importantly, what choices steer it toward sustainability?”
Testing the footprint
To compile the true environmental footprint of AI, the researchers gathered data across a range of fields, including manufacturing, financial, and even marketing data. The aim was to understand not only the growth of the sector but also where that growth is happening. They then combined this with data on resource consumption and power systems to see how it all tied together with changes in the climate.
“There’s a lot of data, and that’s a huge effort. Sustainability information, like energy, water, and climate, tends to be open and public. But industrial data is hard, because not every company is reporting everything,” the researchers explain. “And of course, eventually, we still need to be looking at multiple scenarios. There’s no way that one size fits all. Every region is different for regulations. We used AI to fill some of the data gaps as well.”
Suffice it to say, it’s not enough simply to project the impact of AI on the environment. The researchers also wanted to provide guidance so that the seemingly inevitable investment in AI infrastructure is made in as sustainable a way as possible.
Location matters
The results show that there generally isn't one single thing that has an overwhelming impact. Instead, it's the combination of decarbonizing the electricity grid, operating data centers efficiently, and siting them carefully that collectively makes a difference.
The last point was perhaps the most interesting. At the moment, hyperscalers often locate data centers in areas already facing water shortages. Similarly, in places like northern Virginia, the rapid rise of data centers is placing significant strain on local infrastructure.
Simply locating data centers in areas with more ample water supplies could cut water demand by over 50%, especially if coupled with greater cooling efficiency. Combined with improvements to grid efficiency, that could reduce water usage by around 86%.
The researchers argue that states in the Midwest or the so-called "windbelt" would be ideal for this, including Nebraska, Texas, and Montana.
Decarbonization is key
While location is crucial, the researchers don't ignore the fact that decarbonizing the electricity grid remains hugely important, if also significantly harder to achieve. They say that if the pace of decarbonization doesn't keep up with the rising demand from data centers, emissions could rise by around 20%.
“Even if each kilowatt-hour gets cleaner, total emissions can rise if AI demand grows faster than the grid decarbonizes,” the researchers explain. “The solution is to accelerate the clean-energy transition in the same places where AI computing is expanding.”
The authors acknowledge, however, that while decarbonization is important, it's not a silver bullet. Indeed, even if we achieve much higher levels of decarbonization than we have now, it might not be enough given the rapacious appetite of the industry. It's estimated that by 2030, even with high levels of renewable energy, we'd need 28 gigawatts of wind or 43 gigawatts of solar capacity to reach net zero.
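The wind and solar figures are roughly interchangeable in energy terms, which a quick sketch can illustrate. The capacity factors below are typical US averages and are assumptions, not values from the study: around 35% for onshore wind and 23% for utility-scale solar.

```python
# Why 28 GW of wind and 43 GW of solar are roughly equivalent in annual energy.
# Capacity factors are typical US averages (assumptions, not study values).

HOURS_PER_YEAR = 8760

def annual_twh(capacity_gw: float, capacity_factor: float) -> float:
    """Annual energy output (TWh) from nameplate capacity and capacity factor."""
    return capacity_gw * HOURS_PER_YEAR * capacity_factor / 1000

wind_twh = annual_twh(28, 0.35)    # roughly 86 TWh
solar_twh = annual_twh(43, 0.23)   # roughly 87 TWh
print(f"wind: {wind_twh:.1f} TWh, solar: {solar_twh:.1f} TWh")
```

Under these assumed capacity factors, the two build-outs deliver almost the same annual energy, which is consistent with the study quoting them as alternatives.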
Ultimately, we need a lot more coordination among hyperscalers, utilities, and regulators to ensure that AI can develop in a way that doesn't drain water and energy resources (or contribute massively to emissions).
“This is the build-out moment,” the researchers conclude. “The AI infrastructure choices we make this decade will decide whether AI accelerates climate progress or becomes a new environmental burden.”

