The adoption of liquid cooling in data centers is on the rise. But interest in advanced cooling is no longer just about density and “hot hardware,” according to our panel of data center experts, who cite a confluence of factors that includes sustainability and edge deployments.
The progress of liquid cooling is just one of the topics we’ll examine this week in the Data Center Frontier Executive Roundtable, which features the insights of six thought leaders on the state of the data center industry. Our Second Quarter 2022 roundtable will also explore several other relevant topics: how cloud and edge architectures are shaping customer deployment options, the role of microgrids in the future of digital infrastructure, and the data center industry’s progress on workplace diversity and inclusion.
Here’s a look at our distinguished panel.
- Shannon Hulbert, CEO and Co-Founder of cloud computing specialist Opus Interactive.
- Rob Rockwood, the President of data center developer Sabey Data Centers.
- Nancy Novak, Chief Innovation Officer at Compass Datacenters and Advisory Council member for Infrastructure Masons.
- Steven Carlini, Vice President of Innovation and Data Center, with the Energy Management Business Unit of Schneider Electric.
- Brad Furnish, Vice President of Global Sales and Marketing for TMGCore, a specialist in cooling high-density IT workloads.
- Phillip Marangella, Chief Marketing Officer for data center developer and operator EdgeConneX.
Each day this week I’ll moderate a Q&A with these executives on one of our key topics. We begin with our panel’s take on the state of liquid cooling.
Data Center Frontier: Is liquid cooling gaining traction? What are the key factors that will guide whether liquid cooling technologies see greater adoption?
Phillip Marangella, EdgeConneX: If we look at the two primary modes of liquid cooling – direct-to-chip cooling and immersion cooling – we can understand why there is a lot of excitement around these options. Chip fabricators are increasing transistor densities, generating more heat, and customers are looking for higher rack densities in data centers as they implement machine learning, AI, gaming, Web3.0 worlds and applications, and other compute-intensive services.
Direct-to-Chip cooling appears to hold more near-term promise because it is better suited to retrofitting into existing data center architecture and can be applied to CPUs and GPUs. Immersion cooling has its own compelling use cases, but because it cools all sources of the power draw on IT hardware, it requires more specialized equipment, making it more challenging to introduce into existing data centers.
As for traction, announcements like NVIDIA’s recent liquid-cooled GPU are a sign that liquid cooling has generated serious interest from customers looking for greener, more efficient data center services. Immersion cooling is on a trajectory to reduce costs and improve the technology, but in the near term direct-to-chip liquid cooling is likely to gain significantly more traction.
Brad Furnish, TMGCore: Yes, liquid cooling is gaining adoption, for several reasons. The first is the thermodynamic limits of air: in several industries, compute density requirements are pushing air cooling to its limits, or air cooling is preventing certain chipsets and server configurations from reaching their optimal performance and lifespan.
Another driver is end users needing compute in non-traditional locations and at the edge. The edge has many different definitions, but every one of them boils down to needing compute and processing at a specific location, whether mobile, on an oil rig, on a mountain, in an extreme climate, or at a forward operating base.
Another big component is the need for greener solutions. Liquid cooling operates at significantly better efficiencies than traditional air cooling; the ability to be a better steward of power utilization and to significantly shrink footprints while increasing computing power is driving adoption.
Lastly, and potentially most importantly, chipsets and server architectures are being designed for higher densities. Within the next few years, chipsets and servers will have TDPs that exceed the thermodynamic limits of air cooling, leaving liquid cooling solutions as the only alternative.
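The thermodynamic limits Furnish describes come down to how much heat a coolant can carry away per unit volume. A rough back-of-envelope sketch (using approximate room-temperature property values for air and water, not figures from the article) shows why liquids are so much more effective:

```python
# Back-of-envelope: heat absorbed per unit volume of coolant for a given
# temperature rise, Q = density * specific_heat * delta_T (J per cm^3).
# Property values are approximate room-temperature figures.

def heat_per_cm3(density_g_cm3: float, cp_j_per_gk: float, delta_t_k: float) -> float:
    """Thermal energy absorbed per cm^3 of coolant for a delta_t_k rise."""
    return density_g_cm3 * cp_j_per_gk * delta_t_k

DELTA_T = 10.0  # assume a 10 K coolant temperature rise

air = heat_per_cm3(0.0012, 1.005, DELTA_T)   # air at ~20 C, sea level
water = heat_per_cm3(1.0, 4.18, DELTA_T)     # water at ~20 C

print(f"air:   {air:.4f} J/cm^3")
print(f"water: {water:.2f} J/cm^3")
print(f"water carries roughly {water / air:.0f}x more heat per unit volume")
```

By this estimate, water absorbs on the order of a few thousand times more heat per unit volume than air for the same temperature rise, which is why rising TDPs eventually make air-based heat removal impractical.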
Steven Carlini, Schneider Electric: Liquid cooling is the most effective way to cool the rising chip densities seen in applications like machine learning and AI servers. It offers a more efficient alternative to air cooling and decreases the overall carbon footprint. Factors impacting adoption include rack densities, pressure to reduce energy consumption, space constraints, water usage restrictions, and harsh IT environments.
Currently, there are five competing liquid cooling technologies with varying degrees of complexity and effectiveness. I predict one or two technologies may emerge as leaders in the next five years, which would benefit economies of scale and pave the way to broader acceptance.
Shannon Hulbert, Opus Interactive: The liquid cooling market is forecast to grow at a CAGR of 20% annually, reaching $7 billion by 2028. Demand for liquid cooling is driven by data growth and increased energy use; as data and energy consumption grow, so does the need to improve efficiency. Right now, global data traffic more than doubles every four years. A billion more people coming online in developing countries, the Internet of Things (IoT), driverless cars, robots, and artificial intelligence (AI) are why US researchers expect global power consumption to triple in the next five years.
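For readers who want to check the arithmetic behind a CAGR projection like the one Hulbert cites, a quick sketch (the 2022 base figure of roughly $2.3B is an illustrative assumption, not a number from the article):

```python
# Compound annual growth: value_n = value_0 * (1 + rate) ** years.
def project(value_0: float, rate: float, years: int) -> float:
    """Project a value forward at a constant compound annual growth rate."""
    return value_0 * (1 + rate) ** years

# Illustrative only: a ~$2.3B market in 2022 growing at a 20% CAGR
# lands near $7B by 2028 (six years out).
market_2028 = project(2.3, 0.20, 6)
print(f"projected 2028 market: ${market_2028:.1f}B")
```

Running the numbers this way is a useful sanity check on any market forecast stated as a CAGR plus an end-year figure.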
Today, data centers consume close to 2% of the world's energy. It's estimated that the information and communications technology (ICT) sector will use 20% of all the world's electricity by 2025, emitting up to 5.5% of all carbon emissions. This is driving a search for alternative energy sources as well as improved energy efficiency across the space, from edge to cloud.
Similar to the way the data center and cloud industries are moving to hybrid approaches, different regions and different use cases will likely determine where and how we see adoption. Considerations include warm regions vs. cold, high energy costs vs. low, and high-performance compute vs. cold storage. Additionally, liquid cooling technology is evolving at multiple levels, from facility to equipment to architecture.
Nancy Novak, Infrastructure Masons: I think we're seeing an uptick in the acceptance and use of liquid cooling, but this is incremental rather than a step function. The current economics don't support mass adoption at this point.
That being said, we are seeing users with large processing workloads identifying the technology as a cost-effective means of controlling heat rejection costs. The fact that data volumes continue to grow at an exponential rate only bodes well for greater use of liquid cooling.
To a certain extent, I think the popularity of the technology will be pulled along, rather than being a primary driver, by a number of factors, including grid stability, compatible hardware, and more skilled personnel, all of which are emerging necessities for supporting ever larger workloads.
Rob Rockwood, Sabey Data Centers: Liquid cooling is gaining traction, but necessity will drive adoption, rather than improvements in liquid cooling technology.
Increasingly, entities will optimize processing for profitable and fast-growing applications by increasing watts per square foot. As those ratios begin to exceed air cooling capabilities, liquid cooling will become the only option.
NEXT: How cloud and edge architectures are shaping customer deployment options.