The World’s AI Generators: Rethinking Water Usage in Data Centers To Build a More Sustainable Future

by Matthew T. Ziegler, Director of HPC and AI Architecture and Performance at Lenovo
Mar 27, 2024 1:50 PM ET

At NVIDIA GTC this week, the keynote address made it clear that data centers are set to grow in number and in power consumption in order to provide AI capabilities to the world. Organizations seeking to modernize trillions of dollars’ worth of installed IT equipment to provide AI services over the next several years will need facilities to house and power that equipment. Accelerated computing will usher in a new era of greater efficiency through intelligence while requiring a steadfast commitment to providing smarter technology that builds a brighter, more sustainable future.

As industries around the globe seek to use AI for analyzing vast bodies of data, power efficiency remains crucial to making the rollout of these compute-intensive workloads accessible for all businesses. A key element to enabling this transformation? Water.

For more than a decade, Lenovo has been at the forefront of enabling efficient, high-power computing without compromise, pioneering Lenovo Neptune™ water-cooling technology and ranking #1 on the Green500 list with cutting-edge designs powered by NVIDIA GPUs that accelerate computing while keeping systems cool even in high-heat, multi-GPU environments.

Water. What is the big deal?

Today is World Water Day 2024, a United Nations Observance that celebrates water and inspires action to tackle the global water crisis in support of water and sanitation for all by 2030. As climate change impacts increase and populations grow, there is an urgent need, within and between countries, to unite around protecting and conserving our most precious resource.

About 71 percent of the earth’s surface is covered by water, yet less than one percent of the planet’s water is available for human consumption. Most of it is either salt water, frozen in polar ice caps, or inaccessible (i.e. too far below the earth’s surface to be extracted). The fresh water that is available isn’t used only for drinking: critical industries like agriculture and manufacturing rely on community water supplies, and data centers require water for critical functions like keeping IT equipment from overheating. Compare the small share of water available to humans with the vast amounts consumed by data center cooling, manufacturing, and agriculture, and it quickly becomes clear that fresh water is a precious and scarce commodity.

NPR reports that the average data center uses 300,000 gallons of water a day to keep cool, roughly equivalent to the water use of 100,000 homes. Many data centers consume water directly on-site to remove the heat generated by IT equipment. But there is a tremendous opportunity to drive greater computing and operational efficiency in today’s data centers: lowering overall electricity consumption and pairing it with cooling methods that require little to no water can dramatically reduce both indirect and direct water usage.

Cooling the Data Center

The solution to cooling massive data centers might seem as simple as installing conventional air-conditioning equipment to chill and circulate air through the facility. Many organizations still use this traditional method of data center cooling, but it is the least energy efficient. Air-cooled data centers rely on large fan systems and air ducts to deliver cold air to the computer systems in the room. Smaller fans inside the equipment then pull that cooler air in through the front and reject heated air out the rear, back into the data center. This process repeats for every IT system in the facility.

Dealing with the heat that is rejected back into the data center is where a large portion of water consumption occurs. Large heat exchangers capture the heated air and transfer its heat into water delivery systems. The heated water is then sent outside the building, where it is sprayed across cooling pads; large fans blow air across these pads, evaporating the water into the surrounding environment as vapor. The water lost to evaporation must then be replenished from external sources. Traditional air cooling therefore requires not only massive amounts of power (for air conditioning and air movement) but also a large amount of water when an organization chooses evaporative cooling for its data center.

In fact, data centers that rely on water for cooling consume millions of gallons a year, putting stress on local water sources. In areas such as the western United States, home to data centers operated by large tech companies, this poses a threat to an already drought-stricken region. The cost of installing and running large air-conditioning and air-movement systems adds yet another downside to the traditional approach. Evaporative cooling, in which warm water is passed over cooler-than-ambient wet media and fans blow air across it to reject heat into the atmosphere as vapor, can likewise draw several million gallons per year from community water supplies.

Using Warm Water to Keep Cool

Lenovo Neptune™ has led the world in data center cooling technology by using liquid to remove heat more efficiently than air. The direct water-cooling solution recirculates loops of warm water to cool data center systems, enabling customers to realize up to a 40% reduction in power consumption and a 3.5x improvement in thermal efficiency compared to traditional air-cooled systems. This approach keeps all server components cool, reducing the need for power-hungry system fans in data center operation.

At NVIDIA’s GTC event this week, Lenovo released the new ThinkSystem SR780a server, which uses Lenovo Neptune™ to achieve an ultra-efficient power usage effectiveness (PUE) of 1.1. PUE is the ratio of the total energy used by the data center facility to the energy used by the computing equipment alone. By directly water-cooling the CPUs, GPUs, and NVIDIA NVSwitch technology, the system can sustain maximum performance without reaching thermal limits. Lenovo Neptune™ also delivers more computing power in a compact footprint, increasing power efficiency without sacrificing performance.
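To make that ratio concrete, here is a minimal sketch of what a PUE of 1.1 implies for facility overhead; the 1,000 kW IT load is purely illustrative and not a figure from this article.

```python
# Illustrative example: what a PUE of 1.1 means in practice.
# PUE = total facility energy / energy used by the IT equipment alone.

it_load_kw = 1000            # assumed IT equipment draw (hypothetical, for illustration only)
pue = 1.1                    # the PUE achieved by the water-cooled system described above

total_facility_kw = it_load_kw * pue           # ~1,100 kW drawn by the whole facility
overhead_kw = total_facility_kw - it_load_kw   # ~100 kW spent on cooling, power delivery, etc.

print(f"Facility draw: {total_facility_kw:.0f} kW, overhead: {overhead_kw:.0f} kW "
      f"({overhead_kw / it_load_kw:.0%} of the IT load)")
```

Under the same assumptions, a facility running at a more typical air-cooled PUE of around 1.5 would spend roughly 500 kW on that same non-compute overhead.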

PUE, an industry metric used to determine the energy efficiency of a data center, and overall power consumption are among the top-tracked sustainability metrics, according to the “Uptime Institute Annual Global Data Center Survey 2021.” Because liquid cooling is a more energy-efficient alternative to air, the system can drive higher sustained performance while consuming less energy. It also allows the ThinkSystem SR780a to fit in a dense 5U package, helping to conserve valuable data center real estate.

Smarter Makes Better Use of Precious Resources

In data center cooling, every bit of water saved adds up fast. Water sourcing is an issue that most organizations operating data centers will have to face head-on. Selecting the best cooling method for a data center is about more than keeping the hardware cool; the choice extends into environmental impact.

Both traditional air cooling and evaporative cooling result in water consumption – water that could have been distributed where it’s needed most in the community. Operating an energy-efficient data center can also have a significant impact on water use. In addition to the water consumed directly within a data center, water is often used indirectly to produce the electricity that powers the facility. For example, it can take up to 11.86 gallons of water to produce just one kilowatt-hour (kWh) of power from fossil fuels (i.e. coal-fired power plants). Coupled with the water used directly for cooling, this indirect consumption can dramatically increase the total water footprint of a data center. Sourcing electricity from renewables can therefore help drive down indirect water usage, and non-potable water is an alternative supply for cooling that helps large data centers conserve drinking water.
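As a rough, back-of-the-envelope sketch, the calculation below applies that 11.86 gallons-per-kWh figure to a hypothetical facility drawing 500 kW around the clock; the facility size is an assumption for illustration only.

```python
# Back-of-the-envelope estimate of indirect water use from electricity generation.
# Uses the up-to-11.86 gallons-per-kWh figure cited above for fossil-fuel power;
# the facility power draw is hypothetical.

facility_draw_kw = 500          # assumed average facility power draw (illustrative)
hours_per_year = 24 * 365       # 8,760 hours
gallons_per_kwh = 11.86         # upper-bound figure for coal-fired generation

annual_kwh = facility_draw_kw * hours_per_year
indirect_gallons = annual_kwh * gallons_per_kwh

print(f"{annual_kwh:,.0f} kWh/year -> up to {indirect_gallons:,.0f} gallons of water")
# ~4.38 million kWh/year -> up to ~51.9 million gallons of indirect water use
```

Even at this modest assumed size, the indirect water footprint of fossil-fuel electricity can dwarf the water sprayed across cooling pads on-site, which is why energy efficiency and water efficiency go hand in hand.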

Other cooling methods, such as data center dry cooling and liquid cooling inside the server, help conserve water and energy because they eliminate the need for evaporative cooling. Lenovo Neptune™ liquid cooling offers innovative technologies designed to serve the dual purpose of keeping IT infrastructure cool and functional while helping to reduce water use and waste.

An Obligation for Water Innovation

Lenovo’s Neptune™ water-cooling system was developed with business performance and energy efficiency as equal considerations. The enormous growth in data generated in recent years has created unprecedented demand for accelerated computing power, at a time when an already limited water supply is being strained throughout the world. As we focus on technology solutions that help build a more sustainable future, Lenovo Neptune™ is designed to make data center computing as powerful and efficient as possible, empowering customers to meet their sustainability goals and to grow generative AI and intelligence exponentially on infrastructure that doesn’t sacrifice efficiency for higher performance.