AI's Exponential Growth Strains Global Energy and Water Resources
TL;DR
- Data centers consumed about 1.5% of the world's electricity in 2024, a share projected to double by 2030 as the global AI build-out accelerates; 2025 investment in AI and data centers (roughly $580 billion) exceeded the roughly $540 billion spent on developing the global oil supply.
- AI data centers, particularly in water-stressed regions, consume significant water for cooling; over 60% of AI-related water use is indirect, occurring at the power plants that supply them, and the added electricity demand strains local grids.
- Individual AI queries consume small amounts of energy (0.24-0.34 watt-hours), but the sheer volume of daily queries accumulates into substantial total energy demand for AI activities.
- Companies are exploring water-efficient cooling methods like direct liquid and immersion cooling, but these alternatives can be more expensive or electricity-intensive than traditional evaporative cooling.
- Community pushback against rising utility rates and environmental impacts led to the delay or cancellation of roughly $93 billion in data center projects between March and June alone, according to Data Center Watch.
- The rapid growth of AI demand creates a challenge for energy infrastructure, as building new energy sources, including renewables and nuclear, takes significantly longer than data center construction.
- AI integration into digital infrastructure, from search suggestions to ad delivery, shifts the conversation from personal choices to a systemic challenge requiring broader societal discussion.
Deep Dive
The rapid construction of data centers to power the AI boom presents a significant and growing challenge to global energy and water resources, with profound implications for utility rates and community infrastructure. While an individual AI query consumes a modest amount of energy, the sheer volume of requests, combined with the energy-intensive nature of data centers, is projected to double data centers' share of global electricity consumption by 2030 (from roughly 1.5% in 2024), and annual investment in AI and data centers now exceeds spending on developing the global oil supply. This surge in demand is contributing to rising utility rates, sparking community backlash and leading to project delays and cancellations.
The water footprint of AI is equally concerning. Data centers, particularly those located in water-stressed regions like Arizona and Nevada, require substantial amounts of water for cooling. While some of this is indirect use from power plants, a significant portion is drinking-quality water needed to prevent equipment damage. Projections indicate that data center water usage will double between 2023 and 2030, with single facilities sometimes consuming more water than entire counties. Furthermore, the production of the chips themselves adds to this water demand, accounting for up to 10% of AI's total water use.
These resource demands are creating political pressure, as seen in recent election campaigns focused on lowering utility rates. Communities are pushing back against projects that threaten to strain local resources, with billions in data center investments already delayed or canceled due to this opposition. This public outcry is forcing companies to explore more efficient cooling technologies, such as direct liquid cooling and immersion cooling, though these alternatives often come with higher costs or increased energy consumption, presenting a trade-off.
The core implication is that AI's integration into daily life is rapidly outstripping the pace of energy infrastructure development. While companies have made climate pledges, the exponential growth in AI demand makes achieving net-zero goals a moving target. This necessitates a systemic conversation about resource allocation and sustainability, moving beyond individual choices to address how AI, as a fundamental digital infrastructure, impacts broader societal resources and long-term environmental stability.
Action Items
- Audit AI query energy use: Calculate average watt-hours per query for 3-5 models and estimate total daily consumption based on 10 billion daily queries (see the sketch after this list).
- Measure data center water stress impact: For 5-10 data centers in water-stressed regions, quantify their water usage against county-level residential consumption.
- Evaluate cooling system efficiency: Compare water and energy usage of evaporative cooling versus direct liquid or immersion cooling for 3-5 data center designs.
- Track project delays/cancellations: Monitor 10-20 AI data center projects for reasons of community pushback or utility rate increases.
- Assess renewable energy procurement: For 3-5 major tech companies, quantify their renewable energy procurement against their reported AI energy demand increase.
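A minimal back-of-envelope sketch for the first audit item, in Python. The per-query figures for Gemini (0.24 Wh) and ChatGPT (0.34 Wh) come from the episode; the third model entry, the 10-billion-queries-per-day volume (taken from the action item above), and the household-consumption comparison are illustrative assumptions, not reported values.

```python
# Back-of-envelope audit of AI query energy use.
# Per-query estimates in watt-hours; "hypothetical_model" is a placeholder
# assumption, not a reported figure.
PER_QUERY_WH = {
    "gemini": 0.24,              # Google's published estimate (from the episode)
    "chatgpt": 0.34,             # OpenAI's published estimate (from the episode)
    "hypothetical_model": 0.30,  # assumed value for illustration only
}

DAILY_QUERIES = 10_000_000_000   # 10 billion queries/day, per the action item


def daily_energy_mwh(wh_per_query: float, queries: int = DAILY_QUERIES) -> float:
    """Convert a per-query estimate into total daily energy in megawatt-hours."""
    return wh_per_query * queries / 1_000_000  # Wh -> MWh


if __name__ == "__main__":
    avg_wh = sum(PER_QUERY_WH.values()) / len(PER_QUERY_WH)
    total_mwh = daily_energy_mwh(avg_wh)
    # An average U.S. household uses roughly 30 kWh/day (assumed round figure).
    households = total_mwh * 1000 / 30
    print(f"Average per-query energy: {avg_wh:.2f} Wh")
    print(f"Estimated daily total:    {total_mwh:,.0f} MWh")
    print(f"Roughly the daily electricity of {households:,.0f} households "
          f"(assumed 30 kWh/day each)")
```

The point of the exercise is that even at a fraction of a watt-hour per query, the assumed query volume yields thousands of megawatt-hours per day; swap in your own model estimates and volume figures to refine it.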
Key Quotes
"According to the international energy agency data centers accounted for about you know 1 5 of the world's electricity consumption in 2024 and that's set to double by 2030 we're seeing so many data centers get built out one number that really struck me was in a recent report uh that there was 580 billion invested globally in ai in 2025 and data centers that's more than the 540 billion spent on developing the global oil supply so this year we spent more on data centers than the oil supply wow"
Casey Crownheart highlights the immense and growing scale of data center construction for AI. She points out that data centers' share of global electricity consumption is projected to double by 2030, and that investment in AI data centers in 2025 is expected to surpass the global investment in oil supply. This indicates a significant shift in energy demand towards digital infrastructure.
"yes so since we you know came up with our own estimates working with leading researchers in this area a couple of companies have come out with estimates of you know how much energy each query or each kind of question you ask to one of its models will use and it turns out we were kind of in the right region so google came out with an estimate that says that you know the average query to its gemini model uses about 0 24 watt hours of electricity um that you know and i like to put things in microwave seconds so that's about the same as a second in the microwave um chatgpt openai came out with its own estimate it's kind of in the same range about 0 34 watt hours so basically the the individual queries are kind of individual questions you're asking it's an not insignificant amount of energy but it's kind of small but they don't say the total energy used for all of its ai activities right exactly"
Casey Crownheart explains that while individual AI queries consume a relatively small amount of energy (comparable to a second in a microwave), the cumulative effect of billions of daily queries is substantial. She notes that companies are beginning to release estimates for individual queries, but this data does not provide a complete picture of the total energy consumption for all AI activities. This suggests that the overall energy footprint of AI is significantly larger than what per-query estimates might imply.
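The microwave comparison in the quote is easy to verify. A minimal sketch, assuming a typical 1,000-watt microwave (the wattage is an assumption; the 0.24 Wh figure is from the episode):

```python
# Sanity-check the "one second in the microwave" comparison.
MICROWAVE_WATTS = 1000   # assumed typical microwave power draw
GEMINI_QUERY_WH = 0.24   # Google's per-query estimate from the episode

wh_per_microwave_second = MICROWAVE_WATTS / 3600           # ~0.28 Wh per second
seconds_equivalent = GEMINI_QUERY_WH / wh_per_microwave_second
print(f"One Gemini query ~ {seconds_equivalent:.2f} s of microwave use")  # ~0.86 s
```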
"two thirds of new data centers that are in development since 2022 are in these kind of water stressed areas yeah and how much of this is is drinking water here what you know and what happens to that water after it's used for cooling the servers it depends on what the water is being used for so when we're talking about water consumption for ai a lot of it is actually what's called indirect use so the water that's used at the power plants that are actually running the data centers you know some estimates say that over 60 of the water consumed when we're talking about ai is is from power plants so it kind of depends on what segment of this you're talking about you know some power plants are able to use treated water or something um but when it comes to the actual data centers and the water that they're using to keep their machines cool a lot of them do need to use drinking quality water because when they're you know doing this evaporative cooling they want to avoid like clogging their pipes bacterial growth it's you know very sensitive equipment"
Casey Crownheart details the significant water usage associated with data centers, particularly in water-stressed regions. She clarifies that a large portion of this water consumption is indirect, occurring at power plants that supply electricity to data centers. However, she also points out that data centers themselves often require drinking-quality water for cooling to prevent equipment damage from clogging or bacterial growth.
"so today a lot of data centers are cooled with what's called evaporative cooling and so basically you know you just let the water evaporate and that cools down the equipment that obviously you lose a lot of the water that you're kind of pulling out of the resources um so there are other techniques so something like direct liquid cooling where you know you have a coolant kind of circulating directly through the servers -- there's also immersion cooling where servers are submerged in some sort of fluid to help keep them cool so there's a lot of interesting alternatives some of them at least right now tend to have some sort of downside so they're either more expensive in some cases they're also more electricity intensive so you know might use as much as 10 more energy as compared to evaporative cooling so it's kind of a trade off but we are seeing you know a lot of companies are sensitive to this especially as we've seen this public outcry"
Casey Crownheart discusses alternative cooling methods for data centers beyond traditional evaporative cooling, which leads to significant water loss. She mentions direct liquid cooling and immersion cooling as potential solutions. However, Crownheart notes that these alternatives can be more expensive or require more electricity, presenting a trade-off. She also observes that companies are becoming more responsive to public concerns about resource usage.
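To make the trade-off concrete, here is a toy comparison in Python. Only the "as much as 10% more energy" figure for direct liquid cooling comes from the quote; the facility's annual energy use and the evaporative water-use intensity below are illustrative assumptions, not values from the episode.

```python
# Toy comparison of evaporative vs. direct liquid cooling for one facility.
# Only the ~10% extra electricity for liquid cooling comes from the episode;
# every other number below is an illustrative assumption.
ANNUAL_IT_ENERGY_MWH = 100_000   # assumed facility energy use per year
EVAP_WATER_L_PER_KWH = 1.8       # assumed evaporative water-use intensity
LIQUID_ENERGY_PENALTY = 0.10     # "as much as 10% more energy" (from the quote)

evap_water_megaliters = ANNUAL_IT_ENERGY_MWH * 1000 * EVAP_WATER_L_PER_KWH / 1_000_000
liquid_extra_energy_mwh = ANNUAL_IT_ENERGY_MWH * LIQUID_ENERGY_PENALTY

print(f"Evaporative cooling: ~{evap_water_megaliters:,.0f} ML of water per year")
print(f"Direct liquid cooling: ~{liquid_extra_energy_mwh:,.0f} MWh extra electricity "
      f"per year, but far less evaporative water loss")
```

Under these invented numbers, the choice is essentially water saved versus electricity added, which is the trade-off the episode describes.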
"i think that you know as people continue to see prices go up across the board i think there's you know even more sensitivity to this and so we're seeing a lot of projects blocked one one recent report from data center watch found that from just march to june about 93 billion worth of projects were either delayed or canceled because of community pushback what happened to all those climate pledges that google and microsoft and others were promising a few years back yeah that this is a great question"
Casey Crownheart connects rising utility costs to increased community pushback against data center construction. She cites a report indicating that billions of dollars in data center projects have been delayed or canceled due to this opposition. Crownheart also raises the question of how these developments align with previous climate pledges made by major tech companies, suggesting a potential conflict between ambitious environmental goals and the practical demands of AI infrastructure expansion.
"i think that this is overall a systems conversation that we need to be having rather than you know talking about personal choices and personal use good point casey you always bring good stuff to us thank you for taking time to be with us today thanks so much for having me"
Casey Crownheart emphasizes that the impact of AI on resource consumption, such as electricity and water, should be viewed as a systemic issue rather than a matter of individual choices. She suggests that AI is becoming integrated into digital infrastructure, influencing everything from search results to advertisements, making it a collective challenge. This framing shifts the focus from personal usage to broader societal and infrastructural considerations.
Resources
External Resources
Publications
- "The Times" - Mentioned in relation to a report on Meta's new data center power consumption.
Research & Studies
- International Energy Agency report - Cited for data on data center electricity consumption projections.
- Recent report - Mentioned for global investment figures in AI and data centers.
- Data Center Watch report - Cited for the value of projects delayed or canceled due to community pushback.
People
- Casey Crownheart - Senior climate reporter at MIT Technology Review, discussed impacts and costs of data center construction.
- Ira Flatow - Host of Science Friday.
- D. Peterschmidt - Producer of the episode.
- John Dankosky - Contributor to the episode.
- Danielle Johnson - Contributor to the episode.
- Bethy Amy - Contributor to the episode.
- Jackie Harshfeld - Contributor to the episode.
Organizations & Institutions
- Meta - Mentioned for their new data center in Louisiana and its power requirements.
- Google - Mentioned for their Gemini model's energy consumption estimates and their 2030 net-zero emissions goal.
- OpenAI - Mentioned for their ChatGPT model's energy consumption estimates.
- Microsoft - Mentioned for water used in chip manufacturing and its agreement to purchase power from the restarted Three Mile Island nuclear plant.
- MIT Technology Review - Publication where Casey Crownheart is a senior climate reporter.
- WNYC Studios - Distributor of Science Friday.
Websites & Online Resources
- sciencefriday.com - Science Friday's website, mentioned during the episode.
Other Resources
- AI boom - Discussed as the driver for data center construction and its associated resource usage.
- Crypto mining - Used as a point of comparison for the scale of resource drain from data centers.
- Evaporative cooling - Discussed as a current method for cooling data centers that uses significant water.
- Direct liquid cooling - Mentioned as an alternative cooling technique for data centers.
- Immersion cooling - Mentioned as an alternative cooling technique for data centers.
- Net zero energy goals - Discussed in relation to tech companies' climate pledges.
- Renewables - Mentioned as a potential energy source for data centers.
- Nuclear power - Mentioned as a potential energy source for data centers, referencing efforts to reopen plants.
- Fossil fuels - Mentioned as an example of energy-consuming lifestyles with trade-offs.
- Digital infrastructure - Discussed as the context for AI's integration into daily life.