Executive Roundtable: The Integration Imperative
For the third installment of our Q3 2025 Executive Roundtable, we focus on "the integration imperative": how the convergence of thermal, water, and power systems is breaking down old silos and reshaping organizational strategies.
With AI infrastructure driving unprecedented density and scale, efficiency can no longer be achieved by optimizing systems in isolation. Cooling, water management, and energy use are now deeply interdependent, demanding new approaches that cut across traditional boundaries.
To understand how the industry is responding, we asked our panel of experts how their customers are rethinking integration and what role their companies are playing in enabling it. Their perspectives highlight how this shift is unfolding today, and how integrated design could set the standard for data center operations in the years ahead.
Our distinguished panelists this quarter bring deep expertise spanning direct-to-chip liquid cooling, advanced water management and chemistry, and integrated HVAC and thermal control systems. They include:
- Brandon Peterson, Senior Vice President of Business Development, CoolIT Systems
- Mukul Girotra, Senior Vice President and General Manager, Global High-Tech Division, Ecolab
- Becky Wacker, Vice President – Data Center Solutions, Trane
The panel offers a clear view of how integration is reshaping the data center. Brandon Peterson of CoolIT Systems points out that liquid cooling links IT and facilities more tightly than ever, demanding real-time management of flow rates and resiliency. Mukul Girotra of Ecolab argues that AI makes siloed engineering untenable, highlighting digitally enabled solutions that balance water and energy use across the stack. Becky Wacker of Trane emphasizes how AI magnifies complexity, and that integrated designs with advanced controls and waste heat recovery are key to scaling efficiency and reliability.
And now onto our third Executive Roundtable question of the quarter.
Data Center Frontier: How are your customers rethinking the integration of thermal, water, and power systems as AI infrastructure scales, and what role is your company playing in breaking down legacy silos between them?
Brandon Peterson, CoolIT Systems: Silos between thermal, water, and power systems still exist and often extend to IT teams, as liquid cooling creates a tighter link between IT and facility infrastructure.
In the past, maintaining an average kilowatt rating per rack along with standard ASHRAE air temperature guidelines was typically sufficient to keep IT equipment within required thermal limits.
Today, data centers must consider water temperatures, flow rates, pressure levels, system resiliency and redundancy. In an air-cooled data center, a cooling system failure may result in a gradual temperature rise, giving IT equipment several minutes or even hours before throttling or shutdown is needed.
In contrast, liquid-cooled environments require operators to monitor and manage system parameters at a much finer time scale (often within seconds) to avoid thermal excursions.
CoolIT helps break down these silos by: (1) designing products and services with stringent customer requirements in mind; (2) providing system-level expertise across the entire liquid cooling chain from cold plates to CDUs; and (3) identifying and mitigating complexities and risks based on years of hands-on experience.
While the liquid cooling market has grown rapidly with many new entrants over the past two years, CoolIT has been deploying liquid cooling in the data center at scale since 2017.
That experience is embedded in our systems and processes, allowing us to help customers avoid risks and solve challenges we've already encountered and overcome firsthand.
Mukul Girotra, Ecolab: The AI infrastructure revolution is forcing a complete rethinking of how thermal, water, and power systems interact. It’s breaking down decades of siloed engineering approaches that are now proving inadequate given the increased rack demands.
Traditionally, data centers were designed with separate teams managing power, cooling, and IT equipment. AI scale requires these systems to operate holistically, with real-time coordination between power management, thermal control, and workload orchestration.
Here’s how Ecolab is addressing integration:
We extend our digitally enabled approach from site to chip, spanning cooling water, direct-to-chip systems, and adiabatic units, driving cleanliness, performance, and optimized water and energy use across all layers of cooling infrastructure.
Through collaborations like the one with Digital Realty, our AI-driven water conservation solution is expected to drive up to 15% water savings, significantly reducing demand on local water systems.
Leveraging the ECOLAB3D™ platform, we provide proactive analytics and real-time data to optimize water and power use at the asset, site, and enterprise levels, creating real operational efficiency and turning cooling management into a strategic advantage.
We provide thermal, hydro and chemistry expertise that considers power constraints, IT equipment requirements, and day-to-day facility operational realities. This approach prevents the sub-optimization that can occur when these systems are designed in isolation.
Crucially, we view cooling through the lens of the water-energy nexus: choices at the rack or chiller level affect both Power Usage Effectiveness (PUE) and Water Usage Effectiveness (WUE) of a data center, so our recommendations balance energy, water, and lifecycle considerations to deliver reliable performance and operational efficiency.
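To make the water-energy nexus concrete, the two metrics can be sketched in a few lines of code using their standard industry definitions (PUE as total facility energy over IT energy, WUE as site water use over IT energy). The figures below are hypothetical examples for illustration, not measurements from any panelist's facility.

```python
def pue(total_facility_energy_kwh: float, it_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy (ideal = 1.0)."""
    return total_facility_energy_kwh / it_energy_kwh

def wue(site_water_liters: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: site water use (liters) / IT energy (kWh)."""
    return site_water_liters / it_energy_kwh

if __name__ == "__main__":
    # Hypothetical annual figures for a single site
    it_kwh = 1_000_000        # IT equipment load
    facility_kwh = 1_300_000  # total, including cooling and power-delivery losses
    water_l = 1_800_000       # site water consumption

    print(f"PUE = {pue(facility_kwh, it_kwh):.2f}")   # 1.30
    print(f"WUE = {wue(water_l, it_kwh):.2f} L/kWh")  # 1.80
```

The coupling Girotra describes shows up directly in these ratios: a cooling choice that cuts fan or chiller energy (lowering PUE) may rely on evaporative water use (raising WUE), which is why recommendations must balance both.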
The companies that will succeed in AI infrastructure deployment are those that abandon legacy siloed approaches and embrace integrated thermal management as a core competitive capability.
Becky Wacker, Trane: Data centers are inherently complex with a multitude of demands and systems, and the scale of requirements for AI data centers significantly amplifies these challenges. It’s a complex ecosystem where thermal management, water, and power systems integrate to flex and scale with the variability of compute power demands.
Traditionally, these systems were treated in silos, but that approach limits the capabilities of the technologies and the efficiencies that can be achieved.
Breaking down silos requires focusing on the data center's key objectives: availability of power, fast response to power and cooling demands, reliability and longevity of systems, and resiliency to utility interruptions, all of which drive toward minimizing downtime and achieving consistent profitability.
Our customers are rethinking this integration to achieve unified, scalable designs that enhance efficiency and reliability. Trane plays a pivotal role in breaking down legacy silos by offering integrated solutions that seamlessly connect cooling, power, and water systems. We have an extensive portfolio of Thermal Management solutions, including advanced control platforms that enable centralized management and coordination of these systems, ensuring they work harmoniously to support AI workloads.
We collaborate closely with our customers to design and implement bespoke solutions that meet their specific needs, leveraging our expertise in thermal management, energy efficiency, and water conservation to drive innovation and integration.
Additionally, we have incorporated a number of different waste heat recovery solutions to further enhance energy efficiency and sustainability, turning potential waste into a valuable resource. This approach is a performance multiplier. It not only optimizes the performance of data centers but also contributes to achieving sustainability goals.
By focusing on these integrated solutions, Trane helps data center operators rethink the traditional silos and adopt a holistic approach to managing thermal, water, and power systems. This helps ensure that data centers can flex and scale efficiently with the demands of AI infrastructure, maintaining reliability, minimizing downtime, and achieving consistent profitability.
About the Author
Matt Vincent
A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.