Liquid Cooling – A Better Solution for Data Center Cooling

Nov. 2, 2016
Your data center needs a better way to cool off. Now, liquid cooling technologies can take the heat off of your data center and introduce next generation levels of optimization.

New data center optimization technologies are changing the way administrators deploy cooling and control platforms. Liquid cooling isn’t just a system to optimize data center cooling and server performance—it’s a way to revolutionize data center economics in general. These are tested, proven systems that help a data center support some of the world’s most advanced use-cases.

Although there will still be a place for traditional, airflow-based cooling methodologies, many organizations are now looking into liquid cooling for their servers and overall data center environments. Above all, high-density computing environments, such as those used for high-performance computing (HPC) and big data, require a new way to remain efficient. In this special report from Ebullient, we learn about new use-cases around liquid cooling and how this technology impacts modern data center deployments.

Download the Data Center Frontier Special Report on Liquid Cooling

In the World of Liquid Cooling – Why “Engineered Fluid” is a Better Option Than “Water”

Whenever the concept of liquid cooling comes up, engineers too often assume that it means water. However, in the world of modern workloads, HPC, big data, and high-density computing, using water within your cooling architecture may not be the best option.

In working with precision cooling systems, engineers are able to deliver a next-generation direct liquid cooling technology that eliminates conventional liquid cooling system risks and further improves efficiency. Rather than warming a fluid or using traditional water, the Ebullient system absorbs heat by vaporizing a dielectric, non-toxic, non-flammable engineered fluid from 3M within sealed modules mounted directly to the hottest server devices (typically the processors and co-processors). From there, you can cool powerful systems used for HPC, big data processing, and even GPU-based workloads, including systems built around NVIDIA Tesla K80 GPUs.

A Data Center Cooling Study

A recent study determined power consumption, water consumption, and facility design all benefit from deploying a two-phase Ebullient DirectJet cooling system in an Open Compute data center. To obtain field data, Ebullient instrumented Winterfell-class Open Compute servers and measured baseline cooling energy consumption and IT load in the stock, air-cooled configuration. Ebullient then installed its two-phase cooling system on the servers and again measured cooling energy consumption and IT load. Based on the study results, Ebullient recommends a hybrid data center cooling strategy featuring the Ebullient cooling system for primary IT heat load matched to an adiabatic cooler.

From there, if needed, CRAHs can be deployed within the data center to handle the residual heat load. In a few very warm and humid climates, traditional precision cooling can be used for the residual load, or even non-precision building HVAC components, such as rooftop units.

To put some numbers to it, in a standard Open Compute data center operating 18,000 Winterfell-class servers, a hybrid Ebullient-CRAC cooling strategy would:

  • Reduce cooling energy consumption by 80%.
  • Reduce overall annual power consumption by 10 million kWh.
  • Operate with a Power Usage Effectiveness (PUE) of 1.10.

Importantly, Ebullient cooling systems enable Open Compute data centers to enjoy significantly lower data center build-out costs by decoupling cooling system and structure costs. Ebullient deploys its cooling systems through modular, standard, rack-sized pump stands connected to the building’s secondary liquid loop. Smaller facilities can realize the same performance, efficiency and space benefits with linearly scaled deployment costs.

So, how does this ultimately compare to more traditional cooling systems? In the study, based on a model developed by Iyengar and Schmidt, a completely air-cooled system required 35 CRACs and a water loop drawing 1.43 MW of cooling power to remove an IT thermal load of 2.92 MW (PUE of 1.49). With an Ebullient cooling system in place to offload 70% of the total server thermal load via two-phase cooling, the number of CRACs drops from 35 to 10, requiring only 288 kW to remove the same 2.92 MW of IT load (PUE of 1.10). This represents roughly an 80% reduction in cooling power (1.43 MW to 288 kW).
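The study's headline figures can be cross-checked with a little arithmetic. The sketch below uses the simplifying assumption that PUE ≈ (IT load + cooling power) / IT load, ignoring other facility overheads, and assumes continuous year-round operation (8,760 hours), so treat the results as illustrative rather than exact:

```python
# Numbers quoted in the study above.
IT_LOAD_MW = 2.92          # IT thermal load to be removed
AIR_COOLING_MW = 1.43      # fully air-cooled baseline: 35 CRACs + water loop
HYBRID_COOLING_MW = 0.288  # hybrid strategy: 10 CRACs after Ebullient offload

# Simplified PUE: total facility power over IT power, cooling as the
# only overhead considered.
pue_air = (IT_LOAD_MW + AIR_COOLING_MW) / IT_LOAD_MW
pue_hybrid = (IT_LOAD_MW + HYBRID_COOLING_MW) / IT_LOAD_MW

# Fractional reduction in cooling power.
cooling_reduction = 1 - HYBRID_COOLING_MW / AIR_COOLING_MW

# Annual energy saved by the cooling reduction alone (MW -> kW, 8,760 h/yr).
annual_savings_kwh = (AIR_COOLING_MW - HYBRID_COOLING_MW) * 1000 * 8760

print(f"Air-cooled PUE:    {pue_air:.2f}")            # 1.49
print(f"Hybrid PUE:        {pue_hybrid:.2f}")         # 1.10
print(f"Cooling power cut: {cooling_reduction:.0%}")  # 80%
print(f"Annual savings:    {annual_savings_kwh/1e6:.1f}M kWh")  # 10.0M kWh
```

All three bullet-point claims fall out of the same two cooling-power figures, which is a useful sanity check on the study's internal consistency.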

Download this special report today to learn more about engineered fluid and where it can impact your data center ecosystem. Remember, liquid cooling has emerged as a powerful technology for introducing greater levels of efficiency and lowering the PUE of a data center. Now, find out how this all works and where it can impact your data center cooling strategy.

About the Author

Bill Kleyman

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. Bill is currently a freelance analyst, speaker, and author for some of our industry's leading publications.
