Execution, Power, and Public Trust: Rich Miller on 2026’s Data Center Reality and Why He Built Data Center Richness
DCF founder Rich Miller has spent much of his career explaining how the data center industry works. Now, with his latest venture, Data Center Richness, he’s also examining how the industry learns.
That thread provided the opening for the latest episode of The DCF Show Podcast, where Miller joined Data Center Frontier Editor in Chief Matt Vincent and Senior Editor David Chernicoff for a wide-ranging discussion that ultimately landed on a simple conclusion: after two years of unprecedented AI-driven announcements, 2026 will be the year reality asserts itself.
Projects will either get built, or they won't. Power will either materialize, or it won't. Communities will either accept data center expansion, or they'll stop it.
In other words, the industry is entering its execution phase.
Why Data Center Richness Matters Now
Miller launched Data Center Richness as both a podcast and a Substack publication, an effort to experiment with formats and better understand how professionals now consume industry information.
Podcasts have become a primary way many practitioners follow the business, while YouTube’s discovery advantages increasingly make video versions essential. At the same time, Miller remains committed to written analysis, using Substack as a venue for deeper dives and format experimentation.
One example is his weekly newsletter distilling key industry developments into just a handful of essential links rather than overwhelming readers with volume. The approach reflects a broader recognition: the pace of change has accelerated so much that clarity matters more than quantity.
The topic of how people learn about data centers isn’t separate from the industry’s trajectory; it’s becoming part of it. Public perception, regulatory scrutiny, and investor expectations are now shaped by how stories are told as much as by how facilities are built.
That context sets the stage for the conversation’s core theme.
Execution Defines 2026
After a year of headline-making announcements for massive AI and cloud campuses, Miller argues that the coming year will separate viable projects from aspirational ones.
“If 2025 was a year in which many enormous developments were announced and planned, 2026 is the year in which we find out how many of them will actually be delivered.”
The gating factor, increasingly, is power.
Demand for electricity from hyperscale and AI projects far exceeds what utilities can currently deliver in many regions. Interconnection queues are growing, transmission upgrades take years, and utilities move on timelines that do not match AI deployment cycles measured in quarters.
That mismatch is forcing developers to rethink how power is delivered.
On-Site Power Moves From Stopgap to Standard
One of the most notable shifts Miller highlighted is the industry’s changing view of on-site generation.
Historically treated as temporary bridging capacity while waiting for grid connections, on-site power is now becoming part of baseline project planning. Natural gas turbines, modular generation systems, and even aero-derivative engines drawn from other industries are being deployed to close the gap between immediate demand and utility timelines.
The logic is straightforward: developers can no longer assume grid power will arrive on schedule, and communities are increasingly sensitive to projects perceived as straining local infrastructure.
On-site power also opens the possibility of providing resilience benefits to surrounding communities during outages or extreme weather events, an angle that could reshape public perception in regions skeptical of data center growth.
Community Pushback Becomes a Business Constraint
Perhaps the most consequential portion of the discussion centered on community resistance, which Miller believes the industry underestimated for years.
What once looked like isolated NIMBY disputes has evolved into organized opposition capable of halting or delaying projects worth billions of dollars. Public meetings have overflowed, elected officials have been voted out of office, and development pauses are being considered in several states.
Miller argues the industry can no longer treat community relations as a secondary issue.
“It’s not just a public relations issue—it’s a business issue right now.”
Water usage and electricity pricing have emerged as particular flashpoints, often fueled by incomplete or misleading narratives. Newer data center cooling technologies, including closed-loop systems that significantly reduce water consumption, are frequently overlooked in public debates.
At the same time, mainstream media coverage has shifted, often emphasizing controversy rather than economic benefits.
Miller believes operators must become more proactive in shaping public understanding rather than retreating from coverage.
Recent moves by Microsoft illustrate this shift. Faced with resistance to one project, the company publicly identified itself as the developer, paused development, and committed to working directly with local stakeholders, including assurances that the project would not increase electricity costs for residents.
That kind of engagement may become a requirement rather than an exception.
Nvidia’s Roadmap Raises the Infrastructure Bar
While social and regulatory pressures shape project viability, technology requirements continue to escalate rapidly.
Miller outlined three Nvidia-driven trends that are redefining infrastructure planning for AI factories.
First, rack densities are climbing rapidly. By 2027, Nvidia’s Vera Rubin Ultra systems could drive power densities toward 600 kW per rack, well beyond today’s leading-edge deployments. That trajectory makes direct liquid cooling unavoidable and pushes operators to consider even more advanced cooling techniques.
Second, Nvidia and partners in the Open Compute ecosystem are accelerating the move toward 800-volt DC distribution, delivering much higher power levels directly to racks and chassis. Major infrastructure suppliers are aligning around this transition, which represents a fundamental shift in data center electrical design.
Third, Nvidia’s Omniverse DSX blueprint introduces digital twin concepts into data center planning, allowing operators to simulate hardware upgrades and infrastructure impacts before deployment. The goal is a future where new hardware generations can be integrated with minimal disruption.
Together, these developments raise the bar for facilities seeking to compete in the AI factory segment.
A Stratified Market Emerges
Not every operator will, or should, pursue the AI factory model.
Miller expects the market to stratify, with some developers specializing in ultra-high-density AI facilities while others focus on cloud, colocation, or enterprise-oriented environments.
Interestingly, he suggests enterprise demand may currently be underserved, as capital and attention flow disproportionately toward massive single-tenant AI projects. Many enterprise workloads will continue to rely on cloud, neo-cloud, or third-party providers rather than on-premises deployments, creating opportunities outside the hyperscale arms race.
At the same time, inference workloads may extend the economic life of existing GPU fleets, reshaping investment calculations around hardware turnover.
Nuclear Power: Promise Later, Not Now
On nuclear energy, Miller sees data centers positioning themselves as anchor customers for future small modular reactors. However, meaningful contributions from SMRs remain several years away.
Existing nuclear facilities may see renewed interest through power purchase agreements that extend plant lifetimes, but new deployments face regulatory and public acceptance hurdles. In the meantime, operators must rely on storage, curtailment strategies, and incremental grid improvements to reclaim capacity.
The Only Constant: Speed
If one theme cut across every topic in the conversation, it was the accelerating pace of change.
Miller pointed out that the industry’s current transformation traces back only three years, to the public debut of ChatGPT. Since then, infrastructure investment has surged, AI demand has exploded, and operators remain capacity constrained despite record spending.
“Everything that’s happened since then—that’s barely three years… Things are moving much quicker than they ever have in the data center industry.”
That reality leaves 2026 as a proving ground.
Projects must be delivered. Power must be secured. Communities must be convinced. Infrastructure must evolve.
Execution, not ambition, will determine the next phase of growth.
And for an industry that rarely slows down, Miller’s final advice feels appropriate:
“Everybody should strap on their seatbelts and get ready for the ride.”
At Data Center Frontier, we talk the industry talk and walk the industry walk. In that spirit, DCF Staff members may occasionally use AI tools to assist with content. Elements of this article were created with help from OpenAI's GPT-5.
About the Author
Matt Vincent
A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.