Executive Roundtable: Cooling Imperatives for Managing High-Density AI Workloads

We asked our panel of seasoned industry experts for their insights on the emerging data center cooling technologies and efficiency strategies being deployed to ensure operational stability and sustainability amid the challenges of high-density computing environments.
March 19, 2025
6 min read

For the second installment of our Executive Roundtable for the First Quarter of 2025, we asked our panel of seasoned industry experts for their perspective on the new cooling technologies and efficiency strategies being deployed to maintain operational stability and sustainability while managing high-density artificial intelligence (AI) workloads.

The rapid rise of AI workloads has significantly impacted data center operations, particularly in terms of power consumption and thermal management. AI applications, especially those involving large-scale training models, demand substantial computational power, leading to increased energy usage and heat generation. Projections indicate that AI workloads could account for 15% to 20% of total data center energy consumption by 2028, presenting challenges for existing power and cooling infrastructures. 

And while viable for a certain range of workloads, traditional air-cooling methods are simply inadequate for managing the heat produced by high-density AI servers, especially as rack power densities exceed 40 kilowatts (kW). To address these challenges, data centers are adopting advanced cooling technologies (read: liquid) and efficiency strategies to maintain operational stability and sustainability. 
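To give a sense of why air cooling breaks down at these densities, here is a hedged back-of-the-envelope sketch (illustrative values of our own, not figures from the panel) estimating the airflow required to carry away a rack's heat load using the sensible-heat relation Q = ρ · V̇ · c_p · ΔT:

```python
# Back-of-the-envelope airflow estimate for air-cooling a rack.
# Assumed textbook properties and a 12 K air-side temperature rise;
# these are illustrative assumptions, not values from the article.

RHO_AIR = 1.2    # kg/m^3, air density near 20 C
CP_AIR = 1005.0  # J/(kg*K), specific heat of air

def airflow_m3_per_s(heat_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to absorb heat_kw at a delta_t_k rise."""
    return (heat_kw * 1000.0) / (RHO_AIR * CP_AIR * delta_t_k)

def m3s_to_cfm(m3s: float) -> float:
    """Convert m^3/s to cubic feet per minute."""
    return m3s * 2118.88

for rack_kw in (10, 40, 100):
    flow = airflow_m3_per_s(rack_kw, delta_t_k=12.0)
    print(f"{rack_kw:>3} kW rack -> {flow:5.2f} m^3/s (~{m3s_to_cfm(flow):,.0f} CFM)")
```

Under these assumptions, a 40 kW rack needs several thousand CFM of airflow through a single cabinet, which is where fan power, acoustics, and airflow management become impractical.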

Given this context, with AI-driven workloads pushing power densities beyond traditional limits, we asked our industry expert roundtable which new cooling technologies and efficiency strategies they see being deployed most effectively to maintain data center operational stability and sustainability.

The seasoned data center industry leaders of our Executive Roundtable for the First Quarter of 2025 include:

  • Danielle Rossi, Data Center Strategic Sales Leader, Trane
  • John Pasta, Executive Vice President - Data Center Solutions, JLL, Inc.
  • Michael Lahoud, Co-Managing Partner, Stream Data Centers
  • Ryan Baumann, Vice President of Sales, Power Solutions for the Americas, Rehlko

And now, onto the second DCF Executive Roundtable question for Q1 of 2025.

Data Center Frontier:  With the rapid rise of AI-driven workloads pushing power densities beyond traditional limits, what new cooling technologies and efficiency strategies are being deployed to maintain operational stability and sustainability?

Danielle Rossi, Trane:  With the increased densities required by AI comes the increased requirement of liquid cooling and its associated design demands. 

We have seen the influx of liquid cooling manufacturers and technologies in the market over the last few years, but most have been relatively siloed.

There are many different types of liquid cooling technologies, and each can have different requirements for heat rejection. 

Moving forward, the water system of a data center (closed- or open-loop) needs to be designed holistically. Performing at peak efficiency is challenging without the water system being designed and optimized as one system from roof to rack. 

This may include multiple loop and heat rejection considerations and requires expert mechanical design customized for each installation. 

Earlier and more cohesive vendor engagement in the full system design should become a more prevalent strategy for approaching these deployments.

John Pasta, JLL: The rise of high-density AI workloads is pushing data center cooling systems to their limits. Traditional air-based cooling is insufficient beyond 40-50 kW, prompting a widespread adoption of liquid cooling technologies.

Solutions like direct-to-chip cooling, where coolant is delivered directly to the processors, and rear door heat exchangers, which use liquid to remove heat at the rack level, are becoming standard. 

Liquid cooling also offers significant energy efficiency benefits. Compared to air cooling, it requires less power to achieve the same temperature control, reducing overall electricity consumption. 

This aligns with sustainability goals by lowering the carbon footprint of cooling operations and enabling innovative practices like heat reuse—where captured heat is repurposed for nearby heating needs, such as office spaces or district heating systems. 

Michael Lahoud, Stream Data Centers: For the past two years, Stream Data Centers has been developing a modular, configurable air and liquid cooling system that can handle the highest densities in both mediums. Based on our collaboration with customers, we see a future that still requires both cooling mediums, but with the flexibility to deploy either type as the IT stack destined for that space demands. With this necessity as a backdrop, we saw a need to develop a scalable mix-and-match front-end thermal solution that gives us the ability to late bind the equipment we need to meet our customers’ changing cooling needs.

It’s well understood that liquid far outperforms air in its ability to transport heat, but further to this, with the right IT configuration, cooling fluid temperatures can also be raised, and this affords operators the ability to use economization for a greater number of hours a year. These key properties can help reduce the energy needed for the mechanical part of a data center’s operations substantially. 
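The "liquid far outperforms air" point can be quantified with standard textbook fluid properties (our illustrative numbers, not Stream's): comparing the volumetric heat capacity (ρ · c_p) of water and air shows how much more heat a given volume of coolant can carry per degree of temperature rise.

```python
# Compare volumetric heat capacity (rho * c_p) of water vs air.
# Assumed textbook properties near room temperature; illustrative only.

WATER = {"rho": 998.0, "cp": 4186.0}  # kg/m^3, J/(kg*K)
AIR = {"rho": 1.2, "cp": 1005.0}      # kg/m^3, J/(kg*K)

def volumetric_heat_capacity(fluid: dict) -> float:
    """Heat carried per m^3 of fluid per K of temperature rise (J/(m^3*K))."""
    return fluid["rho"] * fluid["cp"]

ratio = volumetric_heat_capacity(WATER) / volumetric_heat_capacity(AIR)
print(f"Water moves roughly {ratio:,.0f}x more heat than air per unit volume per K")
```

With these values the ratio works out to roughly 3,500:1, which is why modest coolant flow rates can absorb rack-scale heat loads, and why warmer fluid temperatures still leave ample cooling headroom for economization.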

It should also be noted that as servers are redesigned for liquid cooling and onboard server fans are removed or reduced in quantity, more of the critical power delivered to the server is used for compute. This means liquid cooling also improves overall compute productivity, an improvement that is not captured by facility PUE metrics. 
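The PUE caveat is worth making concrete. Because PUE divides total facility power by IT power, and server fans count as IT power, shifting fan load into usable compute leaves PUE roughly unchanged even though productivity per watt improves. A hedged sketch with invented illustrative numbers (not Stream Data Centers' figures):

```python
# Illustrate why removing server fans improves compute-per-watt
# without moving facility PUE. All numbers are invented for illustration.

def pue(it_kw: float, facility_overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return (it_kw + facility_overhead_kw) / it_kw

overhead = 12.0            # kW of facility cooling/power-distribution overhead
air_it, air_fans = 40.0, 4.0   # air-cooled rack: 4 kW of its IT draw is fans
liq_it = 40.0                  # liquid-cooled rack: same draw, fans removed

print(f"Air-cooled:    PUE {pue(air_it, overhead):.2f}, compute {air_it - air_fans:.0f} kW")
print(f"Liquid-cooled: PUE {pue(liq_it, overhead):.2f}, compute {liq_it:.0f} kW")
```

Under these assumptions both racks report the same PUE, yet the liquid-cooled rack delivers more of its power as useful compute, which is exactly the gap in the metric that Lahoud describes.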

Compared with air cooling, liquid cooling does add management challenges around fluid cleanliness, concurrent maintainability, and resiliency/redundancy, but once those are accounted for, the clusters become stable, efficient, and more sustainable, with improved overall productivity.

Ryan Baumann, Rehlko: AI workloads put a huge strain on power infrastructure, making backup power essential for data centers to avoid downtime during grid disruptions or peak demand periods. 

Backup generators are a go-to solution because they’re dependable and built to handle high-power loads when it matters most. Diesel-powered systems, in particular, deliver fast, high-density power, ensuring data centers stay up and running—even under the intense demands of AI applications.

To boost generator efficiency while also moving toward cleaner energy solutions, many data centers are adopting smarter maintenance strategies. We introduced our Conscious Care program to cut fuel use and lower costs while keeping operations running smoothly. 

By letting operators run emergency generators at no load and extending the load interval to every four months, the program helps reduce fuel consumption, air and noise pollution, greenhouse gas emissions, and overall energy expenses.

By implementing more efficient maintenance strategies, data centers can strengthen their power reliability and operational stability while making real progress toward sustainability and efficiency.


Next: Data Center Site Selection and Market Evolution in a Constrained Environment


Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn, following us on X/Twitter and Facebook, as well as on BlueSky, and signing up for our weekly newsletters using the form below.

About the Author

Matt Vincent

Matt Vincent is Editor in Chief of Data Center Frontier, where he leads editorial strategy and coverage focused on the infrastructure powering cloud computing, artificial intelligence, and the digital economy. A veteran B2B technology journalist with more than two decades of experience, Vincent specializes in the intersection of data centers, power, cooling, and emerging AI-era infrastructure. Since assuming the EIC role in 2023, he has helped guide Data Center Frontier’s coverage of the industry’s transition into the gigawatt-scale AI era, with a focus on hyperscale development, behind-the-meter power strategies, liquid cooling architectures, and the evolving energy demands of high-density compute, while working closely with the Digital Infrastructure Group at Endeavor Business Media to expand the brand’s analytical and multimedia footprint. Vincent also hosts The Data Center Frontier Show podcast, where he interviews industry leaders across hyperscale, colocation, utilities, and the data center supply chain to examine the technologies and business models reshaping digital infrastructure. He has served as Head of Content for the Data Center Frontier Trends Summit since its inception. Before becoming Editor in Chief, he served in multiple senior editorial roles across Endeavor Business Media’s digital infrastructure portfolio, with coverage spanning data centers and hyperscale infrastructure, structured cabling and networking, telecom and datacom, IP physical security, and wireless and Pro AV markets. He began his career in 2005 within PennWell’s Advanced Technology Division and later held senior editorial positions supporting brands such as Cabling Installation & Maintenance, Lightwave Online, Broadband Technology Report, and Smart Buildings Technology. 
Vincent is a frequent moderator, interviewer, and keynote speaker at industry events including the HPC Forum, where he delivers forward-looking analysis on how AI and high-performance computing are reshaping digital infrastructure. He graduated with honors from Indiana University Bloomington with a B.A. in English Literature and Creative Writing and lives in southern New Hampshire with his family, remaining an active musician in his spare time.

You can connect with Matt via LinkedIn or email.
