It’s no big secret that generative AI is resource-intensive. In fact, a query made with AI can cost “at least four or five times more computing per search” than a traditional search, according to Martin Bouchard, co-founder of Canada-based data center company QScale, who shared the estimate in a February article in Wired. That extra compute demands more electricity, and it can also require water.
That water consumption came into the spotlight this week in an Associated Press article by Matt O’Brien and Hannah Fingerhut examining the water used by Microsoft’s facility in West Des Moines, Iowa. They reported that in July 2022 (just before Microsoft partner OpenAI wrapped up its training of GPT-4), Microsoft’s Iowa data centers used roughly 11.5 million gallons of water, according to information from the West Des Moines Water Works. That water is used on hot days to help cool things down in and around Microsoft’s machines. The demands aren’t as intense as they would be in hotter climates, such as Arizona or Texas, but Iowa summers can still be punishing.
This demand puts obvious pressure on a Microsoft goal to become net water positive, replenishing more water than it consumes, by 2030. The company isn’t alone in its compute-driven thirst, though. Google’s AI training has reportedly been hydrating more as well.
For its part, Microsoft’s own environmental report showed a 34% increase in water consumption from its 2021 fiscal year to its 2022 fiscal year, reaching almost 1.7 billion gallons in the latter period.
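Taken at face value, the two reported figures (a 34% increase, and a roughly 1.7-billion-gallon FY2022 total) imply a FY2021 baseline. A quick back-of-the-envelope check, using only the numbers above:

```python
# Implied baseline from Microsoft's reported figures:
# ~1.7 billion gallons in FY2022, a 34% increase over FY2021.
fy2022_gallons = 1.7e9
increase = 0.34

fy2021_gallons = fy2022_gallons / (1 + increase)  # implied FY2021 usage
added_gallons = fy2022_gallons - fy2021_gallons   # year-over-year jump

print(f"Implied FY2021 usage: {fy2021_gallons / 1e9:.2f} billion gallons")
print(f"Year-over-year increase: {added_gallons / 1e6:.0f} million gallons")
```

That works out to a baseline of roughly 1.27 billion gallons, meaning the company added on the order of 430 million gallons of consumption in a single fiscal year.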
As these costs climb, researchers are trying to estimate AI’s broader environmental footprint. The AP quoted Shaolei Ren, an associate professor of computer science at the University of California, Riverside, who is tackling this specific question. He discussed the issue at length in an April interview with The Markup CEO Nabiha Syed, describing ChatGPT’s water needs as follows:
For inference (i.e., conversation with ChatGPT), our estimate shows that ChatGPT needs a 500-ml bottle of water for a short conversation of roughly 20 to 50 questions and answers, depending on when and where the model is deployed. Given ChatGPT’s huge user base, the total water footprint for inference can be enormous.

—Shaolei Ren
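Ren’s figure is easy to break down into a per-exchange range. A minimal arithmetic sketch of what it implies (the 100-million-exchanges-per-day scale-up at the end is purely illustrative, not a reported figure):

```python
# Ren's estimate: ~500 ml of water per conversation of 20-50 Q&A exchanges.
BOTTLE_ML = 500

def ml_per_exchange(exchanges: int) -> float:
    """Water per single question-and-answer exchange, in milliliters."""
    return BOTTLE_ML / exchanges

# Per-exchange cost: 500/50 = 10 ml on the low end, 500/20 = 25 ml on the high end.
low, high = ml_per_exchange(50), ml_per_exchange(20)
print(f"{low:.0f}-{high:.0f} ml per exchange")  # 10-25 ml per exchange

# Hypothetical scale-up: if the service handled 100 million exchanges per day
# (an assumed figure, chosen only to illustrate how quickly this compounds):
daily_exchanges = 100_000_000
print(f"{daily_exchanges * low / 1000:,.0f}-{daily_exchanges * high / 1000:,.0f} liters/day")
```

Even at the low end of the range, a few milliliters per exchange multiplied across a large user base adds up to the “enormous” footprint Ren describes.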
There are different ways to frame these needs, of course. Blockchain- and cryptocurrency-mining-related computing has also received a great deal of attention for its electricity costs. Whatever the task being performed, however, it’s important to view these operations in context: their local environmental impact, their cumulative global impact, and any efforts to mitigate or offset that impact.
Regardless, deploying high-performance computing at this scale, in pursuit of increasingly lofty goals, means that answers and solutions are both necessary and worth pursuing, especially where shared natural resources are concerned. After all, people need water to drink, too.
Featured image for this post generated using Midjourney