Why Liquid Cooling is More Sustainable

May 20, 2024
Stuart Lawrence of Stream Data Centers explains that because liquid is a more efficient means of heat transfer than air, liquid cooling will be necessary to support the next generation of processors; it will also reduce the data center’s Scope 2 and 3 emissions.

Air is still the most common medium for carrying heat out of a data center. Yet at some point in the not-distant future, heat fluxes for the most powerful processors will be too high to manage with direct air cooling. When it comes to heat transfer, liquids are fundamentally more efficient than air: they are denser, have higher specific heat capacities, and present lower thermal resistance.

Liquid cooling will be necessary to support the next generation of processors. There’s a significant added benefit as well: liquid cooling is much more sustainable.

Sustainability is no longer only about reducing PUE and WUE. Now the focus is on reducing Scope 1, 2, and 3 greenhouse gas emissions, a more holistic way of thinking about impact across the value chain: direct emissions from sources the operator owns or controls (such as on-site generators), indirect emissions from purchased electricity, and indirect emissions from value chain activities (essentially, embedded carbon).

Most hyperscalers and third-party data center providers have committed to GHG emissions reduction targets in line with limiting global warming to well below 2°C above preindustrial levels by 2050. Liquid cooling will help.

More heat transfer per unit of transport energy reduces Scope 2 emissions

Because water is denser, has a higher specific heat capacity, and presents lower thermal resistance, a given heat load can be removed with dramatically less volumetric flow of water than of air. With water we can accomplish more heat transfer for the same mechanical transport energy. Reducing the energy required in turn reduces Scope 2 emissions (indirect emissions associated with purchased electricity).
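To make the flow comparison concrete, here is a minimal sketch (not from the article) comparing the volumetric flow needed to move the same heat load with air versus water, using Q = ρ · V̇ · c_p · ΔT. The 100 kW load and the fluid properties are typical illustrative values, not site data.

```python
# Volumetric flow required to carry a given heat load at a given dT,
# from Q = rho * V_dot * c_p * dT. Property values are approximate
# room-temperature figures for dry air and water.

def volumetric_flow_m3_s(heat_w, density_kg_m3, cp_j_kgk, delta_t_k):
    """Flow (m^3/s) needed to carry `heat_w` watts at temperature rise dT."""
    return heat_w / (density_kg_m3 * cp_j_kgk * delta_t_k)

HEAT_LOAD_W = 100_000   # assumed 100 kW example load
DELTA_T_K = 10.0        # assumed coolant temperature rise

air_flow = volumetric_flow_m3_s(HEAT_LOAD_W, 1.2, 1005.0, DELTA_T_K)
water_flow = volumetric_flow_m3_s(HEAT_LOAD_W, 997.0, 4186.0, DELTA_T_K)

print(f"Air:   {air_flow:.2f} m^3/s")    # ~8.3 m^3/s
print(f"Water: {water_flow:.4f} m^3/s")  # ~0.0024 m^3/s
print(f"Water needs ~{air_flow / water_flow:.0f}x less volumetric flow")
```

The roughly three-orders-of-magnitude gap in required flow is what drives the transport-energy savings described above.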

Liquid cooling also eliminates the inherent motor heat inefficiencies associated with air cooling. When air handling equipment is located in the data halls, fan motor heat must be factored in to ensure the desired supply air setpoint is achieved. With liquid cooling, in contrast, the pump motor heat is not added to the working fluid and can be decoupled and addressed separately, outside the data hall, again reducing the amount of energy required, and therefore reducing Scope 2 emissions.

Less MEP infrastructure and smaller buildings reduce Scope 3 emissions

In addition to requiring more energy, moving greater volumes of air requires more ductwork and supporting infrastructure than moving water does. With liquid cooling, the same heat load can be moved with dramatically less piping. The piping, pumps, and supporting infrastructure requirements associated with liquid cooling can be reduced further by widening the ΔT for a constant heat load.
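A short, hedged sketch of the ΔT effect: for a fixed heat load, widening the coolant ΔT cuts required flow proportionally, and pipe diameter shrinks roughly with the square root of flow at a constant design velocity. The 1 MW load and the 2 m/s velocity are assumptions for illustration only.

```python
import math

def water_flow_m3_s(heat_w, delta_t_k, density=997.0, cp=4186.0):
    """Water flow needed for a heat load at a given dT (Q = rho*V*cp*dT)."""
    return heat_w / (density * cp * delta_t_k)

def pipe_diameter_m(flow_m3_s, velocity_m_s=2.0):
    """Pipe diameter carrying `flow_m3_s` at an assumed design velocity."""
    area = flow_m3_s / velocity_m_s          # A = V_dot / v
    return math.sqrt(4.0 * area / math.pi)   # d from A = pi*d^2/4

HEAT_W = 1_000_000  # assumed 1 MW data hall load

for dt in (10.0, 20.0):
    flow = water_flow_m3_s(HEAT_W, dt)
    d = pipe_diameter_m(flow)
    print(f"dT={dt:>4} K -> flow {flow * 1000:.1f} L/s, pipe ~{d * 100:.1f} cm")
```

Doubling the ΔT halves the flow and trims the pipe diameter by about 30%, which is the infrastructure reduction the paragraph above describes.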

Liquid cooling reduces the electrical infrastructure requirements as well. With air, 10-20% of the total server power is used for cooling. Power distribution equipment, low voltage switchgear, UPS, batteries, and even generators are sized to carry that load. In contrast, pumps that transport the heat away with liquid are typically shared by dozens of servers and even many racks—dramatically reducing the energy use and the size of equipment needed.
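The sizing arithmetic behind the 10-20% figure can be sketched as follows. The pump overhead fraction for shared liquid cooling is an assumption chosen for illustration, not a figure from the article.

```python
# Electrical infrastructure (switchgear, UPS, batteries, generators) is
# sized for IT load plus cooling overhead. Air cooling adds 10-20% in
# server fan power; shared liquid-cooling pumps add far less.

IT_LOAD_KW = 1000.0          # assumed 1 MW of IT load

air_fan_fraction = 0.15      # midpoint of the 10-20% cited for air cooling
pump_fraction = 0.03         # assumed overhead for shared pumps (illustrative)

air_total = IT_LOAD_KW * (1 + air_fan_fraction)
liquid_total = IT_LOAD_KW * (1 + pump_fraction)

print(f"Air-cooled electrical capacity:    {air_total:.0f} kW")
print(f"Liquid-cooled electrical capacity: {liquid_total:.0f} kW")
print(f"Capacity that never has to be built: {air_total - liquid_total:.0f} kW")
```

Every kilowatt of capacity not built is switchgear, batteries, and generator capacity that never has to be manufactured, which is where the Scope 3 savings come from.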

With the increase in density and the reduction in real estate needed for MEP equipment that liquid cooling allows, we can also right-size the buildings, reducing the building footprint per kW. Altogether, these efficiencies can dramatically reduce Scope 3 emissions (indirect GHG emissions associated with the value chain).

Increased economization reduces Scope 2 (and possibly Scope 3) emissions

To get an 80°F supply air temperature we typically target 70°F chilled water to cooling coils. For liquid cooling, in many cases, we can make 80°F (W27) facility water, and in some cases higher (over 100°F). [Note: With increasing thermal design power (TDP) and chip architectures that require a reduction in Tcase temperature, CHW temperatures will likely stay cooler than initially projected. At this time, supply conditions between 80°F (27°C) and 95°F (35°C) with a ΔT of 10°C are most common.] Raising facility water temperatures means our chillers don’t have to work as hard and we can economize (use outdoor ambient to reject heat) many more hours of the year for a given location.

That means we save a lot of energy, and we can often reduce the size or eliminate compressors altogether (depending on location, available heat transfer surface area and peak ambient design conditions). Thus, raising chilled water temperatures reduces Scope 2 emissions (and Scope 1 during utility disruptions) and could reduce Scope 3 emissions as well.

Bottom line

While the move to liquid cooling at the rack is being driven by next-generation processors that consume much more power, resulting in much higher heat flux, the added benefit is sustainability. Liquid cooling to the rack supports modern workloads and gives data center operators and their tenants the opportunity to dramatically reduce Scope 2 and Scope 3 emissions as well. That's quite a win-win.

About the Author

Stuart Lawrence

Stuart Lawrence is VP of Product Innovation & Sustainability at Stream Data Centers, which builds and operates data centers for the largest and most sophisticated hyperscalers and enterprises: 24 data centers since 1999, with 90% of its capacity leased to the Fortune 100.
