Cooling Consolidation Hits AI Scale: LiquidStack, Submer, and the Future of Data Center Thermal Strategy

Trane’s acquisition of LiquidStack and Submer’s latest expansion moves signal a new phase in AI data center growth, where liquid cooling and integrated thermal strategies are becoming central to deploying high-density, AI-scale infrastructure worldwide.
Feb. 11, 2026

Key Highlights

  • Trane Technologies' acquisition of LiquidStack signifies a shift toward comprehensive thermal management, from plant-level systems to chip-level cooling solutions.
  • Oracle's adoption of closed-loop, direct-to-chip cooling reduces water consumption, addressing community and environmental concerns associated with traditional evaporative systems.
  • Submer is transforming from a cooling specialist into a full-stack AI infrastructure provider, integrating modular data centers, GPU compute, and sovereign cloud services.
  • Partnerships like Submer's with Anant Raj accelerate the deployment of scalable, liquid-cooled AI data centers in India, supporting national AI sovereignty and ecosystem growth.
  • The industry is moving toward integrated, accountable solutions that combine cooling, modular infrastructure, and cloud platforms, shaping the future of AI data center deployment.

As AI infrastructure scales toward ever-higher rack densities and gigawatt-class campuses, cooling has moved from a technical subsystem to a defining strategic issue for the data center industry.

A trio of announcements in early February highlights how rapidly the cooling and AI infrastructure stack is consolidating and evolving: Trane Technologies’ acquisition of LiquidStack; Submer’s acquisition of Radian Arc, extending its reach from core data centers into telco edge environments; and Submer’s partnership with Anant Raj to accelerate sovereign AI infrastructure deployment across India.

Layered atop these developments is fresh guidance from Oracle Cloud Infrastructure explaining why closed-loop, direct-to-chip cooling is becoming central to next-generation facility design, particularly in regions where water use has become a flashpoint in community discussions around data center growth.

Taken together, these developments show how the industry is moving beyond point solutions toward integrated, scalable AI infrastructure ecosystems, where cooling, compute, and deployment models must work together across hyperscale campuses and distributed edge environments alike.

Trane Moves to Own the Cooling Stack

The most consequential development comes from Trane Technologies, which on February 10 announced it has entered into a definitive agreement to acquire LiquidStack, one of the pioneers and leading innovators in data center liquid cooling.

The acquisition significantly strengthens Trane’s ambition to become a full-service thermal partner for data center operators, extending its reach from plant-level systems all the way down to the chip itself.

LiquidStack, headquartered in Carrollton, Texas, built its reputation on immersion cooling and advanced direct-to-chip liquid solutions supporting high-density deployments across hyperscale, enterprise, colocation, edge, and blockchain environments. Under Trane, those technologies will now be scaled globally and integrated into a broader thermal portfolio.

In practical terms, Trane is positioning itself to deliver cooling across the full thermal chain, including:

• Central plant equipment and chillers.
• Heat rejection and controls systems.
• Liquid distribution infrastructure.
• Direct-to-chip and immersion cooling at the server level.

Holly Paeper, President of Commercial HVAC Americas at Trane Technologies, framed the shift clearly:

“Rising chip-level power and heat densities combined with increasingly variable workloads are redefining thermal management requirements inside modern data centers. Customers need integrated cooling solutions that scale from the central plant to the chip and can adapt as performance demands continue to evolve.”

LiquidStack co-founder and CEO Joe Capes, who will continue to lead the business within Trane, emphasized the scale advantage:

“Joining Trane Technologies enables us to accelerate that mission with the resources, scale and global reach needed to power next-generation AI workloads in the most demanding compute environments.”

The acquisition builds on Trane’s minority investment in LiquidStack in 2023 and follows the company’s recently announced acquisition of Stellar Energy, reinforcing a strategy of adding specialist technologies and scaling them through Trane’s global footprint.

The signal to the industry is unmistakable: liquid cooling is no longer niche. Major HVAC incumbents now view it as core data center infrastructure.

Oracle’s Message: Cooling Must Work for Communities, Too

Cooling strategy is also becoming a community acceptance issue as AI campuses expand into new regions where water resources are already under pressure.

In a February 9 blog post, Oracle Cloud Infrastructure architect Travis Grizzel addressed the question communities increasingly ask when new data centers are proposed: “Are you going to use our water?”

Oracle’s answer is increasingly no, at least not in the way traditional evaporative cooling systems do.

In upcoming AI infrastructure deployments across New Mexico, Michigan, Texas, and Wisconsin, Oracle plans to deploy direct-to-chip, closed-loop, non-evaporative cooling systems that do not rely on continuous consumption of potable water. Instead, cooling liquid circulates in sealed systems, removing heat directly from processors before being cooled and reused.

Grizzel summarizes the concept simply:

“The heat leaves the building; cooling liquid does not.”

Oracle’s explanation is intentionally grounded in familiar analogies. Closed-loop systems work much like home air conditioners, where refrigerant circulates rather than being consumed. Direct-to-chip designs go a step further, operating more like a car radiator: liquid absorbs heat directly at the engine (in this case, the processors) before being cooled and recirculated.
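The radiator analogy reduces to a simple heat balance: the coolant flow needed to carry away a chip's heat is Q = ṁ·cp·ΔT. A minimal sketch below illustrates the scale involved; the wattage and temperature rise are illustrative assumptions, not Oracle or LiquidStack specifications.

```python
# Back-of-envelope heat balance for direct-to-chip liquid cooling:
# Q = m_dot * cp * delta_T. All figures are illustrative assumptions.

CP_WATER = 4186.0   # specific heat of water, J/(kg*K)
RHO_WATER = 1.0     # density, kg/L (close enough for a sketch)

def coolant_flow_lpm(heat_w: float, delta_t_k: float) -> float:
    """Liters per minute of coolant needed to remove heat_w watts
    with a delta_t_k temperature rise across the cold plate."""
    kg_per_s = heat_w / (CP_WATER * delta_t_k)
    return kg_per_s / RHO_WATER * 60.0

# A hypothetical 1,000 W accelerator with a 10 K coolant temperature rise:
flow = coolant_flow_lpm(1000.0, 10.0)
print(f"~{flow:.2f} L/min per 1 kW chip")  # roughly 1.4 L/min
```

Because that liquid is recirculated in a sealed loop rather than evaporated, the flow rate drives pump sizing, not water consumption.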

The practical implications are significant:

• Cooling liquid is filled once and continuously reused.
• There is no evaporation or daily water makeup requirement.
• Top-offs are rare and occur only under abnormal conditions.
• Day-to-day water use resembles typical office occupancy needs, not industrial consumption.

Industry estimates cited by Oracle note that conventional evaporative cooling systems can consume millions of gallons of water per year per megawatt of IT load, while closed-loop systems effectively eliminate ongoing water consumption for cooling operations.
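The "millions of gallons per year per megawatt" figure can be sanity-checked from first principles: evaporative cooling rejects heat by boiling off water, whose latent heat of vaporization is roughly 2.26 MJ/kg. The sketch below runs that arithmetic under stated assumptions; it is an order-of-magnitude check, not an engineering model.

```python
# Rough check of annual water use if all IT heat were rejected by
# evaporation. Illustrative physics only; real towers vary with climate,
# cycles of concentration, and load profile.

LATENT_HEAT = 2.26e6            # J per kg of water evaporated
SECONDS_PER_YEAR = 365 * 24 * 3600
KG_PER_GALLON = 3.785           # 1 US gallon of water ~ 3.785 kg

def evaporative_gallons_per_mw_year(it_load_mw: float = 1.0) -> float:
    """Water evaporated per year to reject it_load_mw of continuous heat."""
    kg_per_s = it_load_mw * 1e6 / LATENT_HEAT
    return kg_per_s * SECONDS_PER_YEAR / KG_PER_GALLON

print(f"~{evaporative_gallons_per_mw_year():,.0f} gallons/year per MW")
```

This lands in the multi-million-gallon range the article cites, while a closed loop's ongoing figure, after its one-time fill, is effectively zero.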

For communities wary of large AI campuses straining local water supplies, that distinction is becoming critical. And for operators, this aligns with where vendors like LiquidStack and Submer have been heading: higher efficiency, reduced environmental impact, and scalable performance suited to AI-era densities.

Submer’s Transformation: From Cooling Specialist to Infrastructure Platform

While Trane’s move reflects consolidation at the equipment level, Submer’s recent actions reveal consolidation occurring at the infrastructure platform level.

Long known for liquid cooling innovation, Submer is repositioning itself as a full-stack AI infrastructure provider accountable from chip-level cooling through deployment and operations. Two February announcements illustrate how rapidly that transition is unfolding.

The first is Submer’s acquisition of Radian Arc Operations, an infrastructure-as-a-service provider deploying GPU compute platforms directly inside telecommunications carrier networks. Radian Arc already supports more than 70 telecom and edge compute customers globally, with thousands of GPUs deployed across North America, Europe, India, the Middle East, and Asia-Pacific.

These deployments support latency-sensitive services such as cloud gaming and AI workloads while enabling in-country processing that satisfies growing sovereignty requirements. By combining Radian Arc with InferX (Submer’s NVIDIA Cloud Partner platform), Submer now spans modular facility deployment, cooling integration, AI compute platforms, sovereign cloud services, and telco-embedded GPU infrastructure.

CEO Patrick Smets described the acquisition as completing Submer’s cloud infrastructure stack:

“Bringing Radian Arc together with InferX, our AI operations and delivery platform, forms a dual-plane, sovereign, telco-focused cloud offering that is highly competitive in today’s AI datacenter market.”

He underscored how far the company has moved beyond its origins:

“Built on ten years of liquid cooling leadership, Submer has evolved into a full-stack AI datacenter provider, fully accountable from chip to operation.”

Cooling vendors, in other words, are becoming cloud infrastructure providers.

India: Sovereign AI Infrastructure at Speed

Two days earlier, on February 8, Submer announced a strategic partnership with Anant Raj Cloud, a subsidiary of Anant Raj Limited, to accelerate deployment of AI-ready infrastructure across India.

The collaboration will deploy modular, liquid-cooled AI data centers alongside neocloud and inference platforms delivered through InferX, supporting sovereign and enterprise AI workloads at scale. The partnership aligns with India’s Union Budget 2026–27 focus on AI and semiconductor ecosystem development and positions Anant Raj as a foundational infrastructure partner for India’s expanding AI economy.

Submer CEO Patrick Smets described India as reaching a critical moment:

“By combining Submer’s modular datacenter infrastructure, liquid cooling technologies and prefabricated MEP systems with Anant Raj’s campus development capabilities, we bring high-performance AI compute online fast while significantly reducing environmental impact.”

Anant Raj Managing Director Amit Sarin framed the effort in national terms:

“This collaboration expands access to high-performance computing while advancing India’s AI sovereignty goals and nurturing a scalable, homegrown ecosystem.”

The move reflects a broader global shift as nations seek sovereign AI capacity rather than relying exclusively on hyperscale deployments concentrated in a handful of regions.

Infrastructure Is Becoming Integrated and Strategic

What links these announcements is not simply liquid cooling adoption, but a structural shift in how AI infrastructure is being deployed.

AI-scale buildouts now require integrated solutions combining cooling, modular facilities, GPU infrastructure, and cloud platforms operating across centralized hyperscale campuses and distributed edge environments. Cooling decisions increasingly shape rack density, deployment timelines, site selection, and even community acceptance.

The era in which operators assembled infrastructure piece by piece from multiple vendors is giving way to one in which they increasingly seek single accountable partners capable of delivering integrated solutions.

Cooling Becomes Strategic Infrastructure

Three themes stand out.

Cooling technology is consolidating under major infrastructure players, as demonstrated by Trane’s acquisition of LiquidStack. At the same time, companies rooted in cooling expertise, like Submer, are expanding into full AI infrastructure and cloud delivery platforms. And water- and energy-efficient cooling designs, such as those Oracle is deploying, are becoming critical not only to operations but to community acceptance.

As AI factories scale and infrastructure expands into new markets, competitive advantage will increasingly belong to companies capable of delivering scalable cooling, modular infrastructure, sovereign and distributed compute, and sustainable operations acceptable to host communities.

Cooling is no longer simply about temperature control. In 2026, it is increasingly about who controls the future of AI infrastructure deployment.

 

At Data Center Frontier, we talk the industry talk and walk the industry walk. In that spirit, DCF Staff members may occasionally use AI tools to assist with content. Elements of this article were created with help from OpenAI's GPT5.

 
Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn, following us on X/Twitter and Facebook, as well as on BlueSky, and signing up for our weekly newsletters using the form below.

About the Author

Matt Vincent

A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.
