How AI Is Draining Energy and Water Resources
Powering AI systems requires far more than just electricity—it’s also draining our planet’s water resources.
Hello and welcome back to Profitable Pathways!
As the demand for AI continues to surge, the infrastructure powering it is facing increasing scrutiny—especially when it comes to energy and water use.
GPUs, the chips that power tools like ChatGPT, draw significantly more power than CPUs. As a result, they generate far more heat and need constant cooling. To keep these systems running efficiently, data centers must hold temperatures within a tight range, often by relying heavily on water-based cooling systems.
An investigation by Source Material and The Guardian revealed that the three largest hyperscalers—Microsoft, Amazon, and Google—are operating highly water-intensive data centers, with plans to build even more. Globally, there are around 12,000 operational data centers, with about half located in the U.S. Amazon alone is planning several new facilities in northern Spain, a region already grappling with the growing threat of desertification.
These cooling systems consume vast amounts of water, much of which evaporates and never returns to the local water supply, a practice that raises sustainability concerns in the face of global water scarcity.
As we embrace the benefits of AI, it’s crucial to also consider its environmental footprint—and push for more sustainable innovation in the tech industry.
Cheers!
Maham