In the latest episode of the Data Center Frontier Show podcast, Editor-in-Chief Matt Vincent sits down with Jay Dietrich, Research Director of Sustainability at Uptime Institute, to examine what real sustainability looks like inside the data center, and why popular narratives around net zero, offsets, and carbon neutrality often obscure more than they reveal.
Over the course of the conversation, Dietrich walks listeners through Uptime’s expanding role in guiding data center operators toward measurable sustainability outcomes: not just certifications, but operational performance improvements at the facility level.
“Window Dressing” vs. Real Progress
Dietrich is candid about the challenges operators face in navigating the current landscape of sustainability reporting. Despite high-level claims of carbon neutrality, many facilities still operate inefficiently, relying heavily on carbon offsets or energy attribute certificates to hit corporate goals.
“An EU survey found that 80% of data centers report carbon-free operations based on market calculations, while their national grids run at only 55% renewable,” Dietrich says. “The only thing that truly matters is the performance of the actual facility.”
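The gap between those two numbers comes down to accounting method. Here is a minimal sketch, using assumed figures for a hypothetical facility, of how market-based accounting (where energy attribute certificates count as zero-carbon) diverges from the location-based view:

```python
# Market-based vs. location-based Scope 2 accounting for one hypothetical
# facility. All inputs are illustrative assumptions, not figures from the episode.

annual_energy_mwh = 50_000    # facility consumption (assumption)
grid_intensity = 0.35         # local grid average, tCO2 per MWh (assumption)
eac_coverage = 1.0            # share of consumption matched with certificates

# Market-based: certificate-matched energy is counted as zero-carbon,
# so full coverage reports zero emissions regardless of the local grid mix.
market_tco2 = annual_energy_mwh * (1 - eac_coverage) * grid_intensity

# Location-based: every MWh carries the average intensity of the grid
# that actually served the facility.
location_tco2 = annual_energy_mwh * grid_intensity

print(f"market-based:   {market_tco2:,.0f} tCO2")    # 0 -> "carbon-free" on paper
print(f"location-based: {location_tco2:,.0f} tCO2")  # 17,500 -> what the grid emitted
```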
To close this gap, Uptime offers a Sustainability Gap Analysis and a Sustainable Operations Certification, helping data center operators minimize energy and water use, improve cooling efficiency, and increase the useful work delivered per megawatt hour.
Redefining the Sustainable Data Center
One of the discussion’s core messages: a net-zero data center is not necessarily a sustainable one. Dietrich stresses the need to shift focus from corporate carbon accounting toward IT utilization, emphasizing metrics like the following (sketched in code after the list):
- Work delivered per unit of energy consumed.
- Work delivered per metric ton of CO₂ emitted (location-based).
- Actual IT infrastructure utilization rates.
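In code terms these metrics reduce to simple ratios; the hard part is measuring “useful work” consistently. A minimal sketch with assumed inputs, where work units stand in for whatever an operator actually tracks (transactions, inferences, batch jobs):

```python
# Facility-level sustainability metrics as ratios. All inputs are assumptions;
# "work units" are a placeholder for an operator's own workload measure.

work_units = 2_000_000        # useful work delivered over the period (assumption)
energy_mwh = 4_000            # energy consumed over the same period (assumption)
grid_intensity = 0.35         # location-based grid average, tCO2/MWh (assumption)
avg_demand_servers = 12.0     # average utilized capacity, in server-equivalents
deployed_servers = 100        # provisioned capacity (assumption)

work_per_mwh = work_units / energy_mwh
work_per_tco2 = work_units / (energy_mwh * grid_intensity)  # location-based
utilization = avg_demand_servers / deployed_servers

print(f"work per MWh:        {work_per_mwh:,.0f}")
print(f"work per tCO2:       {work_per_tco2:,.0f}")
print(f"IT utilization rate: {utilization:.0%}")
```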
Underutilized IT infrastructure — still common across the industry — is one of the biggest sustainability blind spots.
“Running IT at 10% utilization wastes capacity, space, and energy,” says Dietrich. “Increasing that to 50% can cut equipment needs by two-thirds for the same workload.”
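A toy calculation makes the scale of the consolidation effect clear, assuming required equipment scales inversely with average utilization (all numbers are illustrative):

```python
import math

def servers_needed(workload: float, per_server_capacity: float,
                   avg_utilization: float) -> int:
    """Servers required to carry a workload at a given average utilization."""
    return math.ceil(workload / (per_server_capacity * avg_utilization))

# Illustrative workload and capacity; only the utilization ratio matters here.
baseline = servers_needed(1_000, 100, 0.10)   # 10% utilization -> 100 servers
improved = servers_needed(1_000, 100, 0.50)   # 50% utilization -> 20 servers

print(f"{baseline} -> {improved} servers: "
      f"{1 - improved / baseline:.0%} less equipment for the same workload")
```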
The Carbon-Free Energy Challenge
The road to 100% carbon-free energy (CFE) remains long and complex. Dietrich breaks down the limits of wind and solar, citing the need for dispatchable generation to cover low-output periods, a gap typically filled today by natural gas plants or diesel gensets.
He also highlights how data centers are driving innovation in energy procurement, from fixed-price PPAs that support nuclear generation (e.g., Constellation’s Three Mile Island restart agreement and the co-location deal at Talen’s Susquehanna plant) to emerging microgrid strategies and co-location with renewable assets.
Still, systemic constraints remain. “Even with aggressive investment, we’re more than a decade away from achieving 100% CFE,” he cautions.
AI, Efficiency, and the Coming Wave of Demand
AI’s rapid growth is top-of-mind across the industry, and Dietrich expects data center energy consumption to grow by around 650 TWh between 2023 and 2030, split roughly evenly between conventional compute and AI.
However, he sees opportunity. “AI developers will need to optimize for work per energy just like the broader industry has,” he says. “Competitiveness will depend on it.”
Rethinking Scope 3 Emissions and Supplier Accountability
Dietrich also calls for a reexamination of Scope 3 emissions accounting, which often involves duplicative or overly generalized methods. Rather than funneling resources into exhaustive accounting, he advocates for direct engagement with suppliers, especially in:
- Steel manufacturing (for low-carbon alternatives).
- Semiconductor production, which accounts for 60% of the embedded carbon in IT gear and relies on fluorinated process gases such as NF₃, SF₆, and CF₄.
“These chemicals have massive global warming potential,” says Dietrich. “We need suppliers to demonstrate destruction of these gases, not just offset them.”
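A rough, order-of-magnitude illustration of why destruction matters, using approximate GWP100 values and hypothetical annual releases (every number below is an assumption):

```python
# Fluorinated process gases, tCO2e per tonne released (approximate GWP100
# values; exact figures vary by IPCC assessment report).
GWP100 = {"NF3": 16_000, "SF6": 24_000, "CF4": 6_600}

released_tonnes = {"NF3": 2.0, "SF6": 0.5, "CF4": 1.0}  # hypothetical releases
destruction_efficiency = 0.95                            # hypothetical abatement rate

unabated = sum(released_tonnes[g] * GWP100[g] for g in GWP100)
abated = unabated * (1 - destruction_efficiency)

print(f"unabated:         {unabated:,.0f} tCO2e")
print(f"with destruction: {abated:,.0f} tCO2e")
```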
Labeling Risks and Regulatory Frontiers
In closing, the conversation turns to regulatory developments in the EU and elsewhere, where data center sustainability labels are gaining traction. While these could create standardization, Dietrich warns that public disclosure of operating data raises confidentiality concerns for operators.
Uptime Institute is actively engaging with regulators to balance transparency with realistic progress, encouraging labeling systems that avoid penalizing operators for issues beyond their control.