Reshaping Energy Supply for the Data Center Value Chain

Peter Huang, Global President - Data Center & Thermal Management at bp Castrol, explains why AI isn't just consuming more power; it's demanding better power systems.
Dec. 24, 2025

Artificial intelligence is reshaping data centers in ways the industry has not seen before. The shift is not simply about deploying more GPUs or building larger facilities. It is about confronting a new reality: power has become the defining constraint of AI infrastructure.

This transformation is a structural one. In the AI era, supplying power is no longer a background utility function. It is a system-level challenge that spans the entire data center value chain—from how energy is generated and delivered, to how it is stabilized, cooled, and sustained over time.

When Rack Power Jumps, Everything Else Follows

For years, data centers evolved within a relatively predictable power envelope. Rack densities of 20–30 kW shaped the design of power distribution, cooling systems, and operational models.

AI has broken that envelope.

Today, AI clusters are routinely designed for 100 kW or more per rack, with leading-edge deployments already testing 150–220 kW. At these levels, power is no longer abstract. Every fluctuation matters. Every inefficiency compounds. And traditional AC-based architectures begin to show their limits.

This is where energy storage quietly changes roles. Instead of serving purely as backup, storage becomes a stability layer—absorbing load spikes, smoothing variability, and enabling AI workloads to run continuously even as power sources diversify.
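The "stability layer" idea can be illustrated with a toy simulation. The load profile, battery capacity, and target grid draw below are invented round numbers for illustration only; a real battery energy storage system would also model efficiency, ramp rates, and control latency.

```python
# Minimal sketch of storage as a stability layer: a battery absorbs
# spikes in an AI cluster's power draw so the grid sees a smoother load.
# All figures here are assumed for illustration, not real system data.

TARGET_GRID_KW = 100.0                          # smoothed draw the grid should see
load_kw = [80, 95, 160, 140, 70, 60, 150, 90]   # spiky per-interval cluster demand

battery_kwh = 50.0     # starting state of charge (assumed)
CAPACITY_KWH = 100.0   # battery capacity (assumed)
INTERVAL_H = 0.25      # 15-minute intervals

grid_kw = []
for demand in load_kw:
    delta = demand - TARGET_GRID_KW      # positive = spike, negative = slack
    energy = delta * INTERVAL_H          # kWh the battery must supply or absorb
    if 0.0 <= battery_kwh - energy <= CAPACITY_KWH:
        battery_kwh -= energy            # battery covers the difference
        grid_kw.append(TARGET_GRID_KW)   # grid sees the flat target
    else:
        grid_kw.append(demand)           # battery at a limit: spike passes through

print("grid draw per interval:", grid_kw)
print("final state of charge: %.2f kWh" % battery_kwh)
```

In this toy run the battery never hits a limit, so the grid draw stays flat at the target even though the cluster's demand swings between 60 and 160 kW; this is the spike-absorbing, variability-smoothing behavior described above, reduced to its simplest form.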

Why High-Voltage DC Is Gaining Momentum

As rack densities climb, attention naturally shifts upstream to power architecture. One approach gaining traction is 800 V high-voltage DC (HVDC) distribution, inspired by Open Compute Project initiatives such as ORv3 and Panama-style power shelf designs.

The appeal is practical rather than theoretical. Fewer power conversions mean lower losses. Higher voltage means lower current and simpler distribution at scale. Just as importantly, DC architectures align more naturally with renewable generation and energy storage, both of which are becoming integral to new AI data center builds.
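The "higher voltage means lower current" point follows directly from P = V × I, and resistive cable loss scales with the square of the current (P_loss = I²R). As a back-of-the-envelope sketch, assuming a 100 kW rack and a single round-number cable resistance (neither drawn from any real deployment):

```python
# Illustrative arithmetic for DC distribution at higher voltage.
# Rack power, bus voltages, and cable resistance are assumed round
# numbers for illustration, not vendor or deployment figures.

RACK_POWER_W = 100_000        # 100 kW AI rack (assumed)
CABLE_RESISTANCE_OHM = 0.001  # same cable run in every case (assumed)

def current_and_loss(voltage_v: float) -> tuple[float, float]:
    """Return (current in A, resistive cable loss in W) at a given bus voltage."""
    current = RACK_POWER_W / voltage_v          # I = P / V
    loss = current ** 2 * CABLE_RESISTANCE_OHM  # P_loss = I^2 * R
    return current, loss

for v in (48, 400, 800):
    i, loss = current_and_loss(v)
    print(f"{v:4d} V bus: {i:8.1f} A, cable loss {loss:10.1f} W")
```

Because loss scales with I², moving the same 100 kW from a 48 V bus to an 800 V bus cuts the current by a factor of about 16 and the resistive loss in this toy model by a factor of roughly 280. The real engineering trade-offs (insulation, safety, conversion stages) are more involved, but the directional math is why fewer conversions and higher voltage are attractive at AI-scale densities.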

This is not a wholesale rejection of existing infrastructure. It is a recognition that AI workloads change the math, and power systems must evolve accordingly.

Energy Supply Is No Longer a Single Decision

One of the most persistent misconceptions in AI data center planning is treating energy supply as a single procurement choice. In reality, it has become a system design exercise.

Modern AI facilities must coordinate:

  • Multiple energy sources, renewable and conventional
  • Storage systems that respond at machine timescales
  • Distribution architectures that scale safely with density
  • Operational strategies that balance uptime, efficiency, and compliance

Viewed this way, energy supply becomes the backbone of the AI data center value chain, shaping decisions far beyond the electrical room—from site selection to cooling strategy to long-term operating cost.

From Design Intent to Operational Proof

As power density rises, so does the cost of uncertainty. Designs that look robust on paper must perform under real thermal and electrical stress before AI hardware is deployed.

Validating power and liquid cooling systems under realistic load conditions has become a critical step in bringing AI data centers online on schedule. Without this validation, operators risk discovering system limitations only after production workloads are live—when margins for error are minimal.

This is where Castrol’s Data Centre & IT Cooling solutions come into play. By supporting system-level testing and validation—replicating rack-level thermal loads, flow behavior, and pressure dynamics—operators gain confidence that power and cooling systems will behave as expected under AI-scale conditions.

The result is not just better performance, but greater predictability across deployment and long-term operations.

The Bigger Picture

AI does not merely consume more power. It demands better power systems—systems that are resilient, efficient, and adaptable as technology and energy markets continue to evolve.

My perspective reflects a broader industry shift. Energy supply is moving from a supporting role to a central design principle of AI data centers. Those who treat it as an integrated, end-to-end system—rather than a collection of components—will be better positioned to scale AI infrastructure sustainably and reliably.

In the AI era, power is no longer just what keeps the lights on. It is what keeps intelligence running.

About the Author

Peter Huang

Peter Huang is Global President of Thermal Management & Data Centre at bp Castrol, leading the company’s global thermal management business for data centers, energy storage, and beyond. Guided by Castrol’s ethos “Onward, Upward, Forward,” he spearheads the company’s “Forward” vision to become the leading liquid cooling orchestrator and partner of choice for customers worldwide.

Under his leadership, Castrol’s Dipping Point research underscored the rising importance of liquid cooling to meet the growing demands of AI, ML, and edge computing. Peter has more than 15 years of international experience in the energy and IT industries. He holds degrees from the University of Illinois Urbana-Champaign, Harvard University, and National Chengchi University.

Follow bp Castrol on LinkedIn: Castrol ON Liquid Cooling.
