AI Data Centers Are Driving Nuclear's Next Commercial Test

From SMRs and microreactors to reactor restarts, uprates, and gas-plus-nuclear energy strategies, AI data center growth is accelerating a broad commercial push toward firm nuclear-backed power infrastructure.

Key Highlights

  • Industry players are exploring integrated compute-plus-power architectures to meet the high-density, high-availability demands of AI data centers.
  • Collaborations like NANO Nuclear with Supermicro and Terrestrial Energy with Riot demonstrate a focus on microreactors and large-scale nuclear projects for digital infrastructure.
  • Capital markets are increasingly investing in nuclear startups, with IPOs like X-energy's signaling confidence in nuclear's role in AI power solutions.
  • Bridge-power strategies combining natural gas and nuclear aim to accelerate deployment timelines while managing regulatory and community concerns.
  • Regulatory frameworks are evolving to support modular and repeatable nuclear deployment, but operational, political, and logistical challenges remain.

Nuclear power is rapidly moving from speculative talking point to active AI infrastructure strategy. Over the past several weeks, and especially in the last ten days, the industry has seen a wave of announcements pointing in the same direction: data center developers, reactor startups, utilities, server manufacturers, industrial suppliers, and large power producers are all attempting to define what nuclear-backed digital infrastructure might actually look like.

The May 6 memorandum of understanding between NANO Nuclear Energy and Supermicro remains one of the clearest signals. The agreement links a microreactor developer with a major AI server and infrastructure supplier, framing nuclear power as part of an integrated compute, cooling, and power stack. But the deal now looks less like an isolated experiment and more like part of a rapidly expanding category.

That same day, Terrestrial Energy and Riot Platforms announced a collaboration to evaluate nuclear-powered large-scale data center projects using Terrestrial’s Integral Molten Salt Reactor technology. A day earlier, Blue Energy and GE Vernova unveiled a 2.5 GW gas-plus-nuclear strategy centered on GE Vernova Hitachi’s BWRX-300 small modular reactor.

In the same window, Curtiss-Wright moved key components of X-energy's Xe-100 program from design into prototype manufacturing. Constellation told investors it has 5 GW of nuclear uprates, gas generation, and battery storage in the PJM queue as prospective data center customers await regulatory clarity. And Oklo and Idaho National Laboratory announced a project applying AI-enabled engineering workflows to advanced reactor development.

Taken together, the announcements reveal a market trying to solve the same problem from multiple directions: AI data centers need firm, large-scale, predictable power faster than the traditional grid-planning process can reliably deliver it.

Changing the Model: NANO Nuclear and Supermicro

NANO Nuclear’s non-binding MOU with Supermicro centers on integrating NANO’s advanced microreactor systems with Supermicro’s AI server and data center platforms. The companies said they will explore dedicated on-site nuclear power for data centers, integration of Supermicro AI racks and cooling systems with nuclear-backed infrastructure, joint go-to-market strategies for hyperscale and enterprise customers, and a self-powered, grid-independent AI infrastructure model.

The most important word remains “explore.” This is not a power purchase agreement, a financing commitment, a construction start, or an NRC license. It is a non-binding MOU. But in the current AI infrastructure market, even exploratory agreements can carry strategic significance because they directly address the industry’s defining constraint: power availability.

The commercial logic is straightforward. Supermicro increasingly wants to sell integrated AI infrastructure rather than servers alone, spanning rack-scale systems, liquid cooling, storage, networking, and modular data center building blocks. NANO, meanwhile, is positioning its KRONOS MMR system as a modular energy platform for high-density digital infrastructure. The MOU envisions compute and power being designed together from the outset rather than integrated later as separate systems.

That convergence matters because AI infrastructure is beginning to collapse the traditional separation between IT and facilities. A modern GPU cluster is no longer simply an equipment deployment. It is simultaneously an electrical, thermal, mechanical, real estate, and regulatory project. The central question for NANO and Supermicro is whether integrated compute-plus-power architectures can evolve from compelling concept into something licensable, financeable, insurable, and operationally viable at real-world scale.

Terrestrial Energy and Riot: Nuclear Meets the Power-First Data Center Developer

The Terrestrial Energy–Riot Platforms announcement pushes the nuclear-data center discussion beyond microreactors and into large-scale campus planning. The companies signed an MOU to evaluate deployment of Terrestrial’s IMSR Plants alongside Riot-developed AI and high-performance computing data centers.

Proposed configurations could include multiple 390 MW IMSR units and as much as 4 GW of nuclear capacity across candidate sites in Texas and Kentucky.
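
As a rough sense of scale (a back-of-envelope sketch, not a figure from either company), the upper end of that range implies on the order of ten IMSR plants:

```python
import math

# Back-of-envelope unit count for the announced figures; the 390 MW and
# 4 GW numbers come from the MOU, everything else here is illustration.
UNIT_MW = 390       # nameplate per IMSR plant
TARGET_MW = 4_000   # upper end of the evaluated capacity (4 GW)

units = TARGET_MW / UNIT_MW
print(f"{units:.1f} units of output, i.e. roughly {math.ceil(units)} IMSR plants to reach 4 GW")
```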

The Riot partnership is notable because the company increasingly represents a new category of power-first digital infrastructure developer. Best known for bitcoin mining, Riot has spent years building capabilities around access to power, substations, and large-scale electrical load management that translate naturally into AI and HPC infrastructure as the market shifts toward power-constrained development models.

Terrestrial’s IMSR design also broadens the reactor conversation. Unlike the smaller microreactor framing around NANO Nuclear, Terrestrial is discussing modular nuclear plants measured in hundreds of megawatts per unit. The company also emphasized hybrid energy configurations that could use natural gas as a bridge fuel to support early operations and resilience while nuclear projects move through licensing and construction.

That bridge-power strategy directly addresses one of the central tensions in nuclear-backed AI infrastructure: AI deployment cycles move much faster than nuclear development timelines. GPUs, cloud leases, and hyperscale capacity commitments operate on timelines measured in quarters, while advanced reactor projects may take years to license and build. A campus that can energize early with gas and later transition to nuclear may prove commercially more realistic than one forced to wait for reactor completion before bringing compute online.

X-energy’s IPO Shows Capital Markets Buying the Nuclear-AI Thesis

X-energy’s late-April IPO adds another critical dimension to the nuclear-data center story: capital formation. According to TechCrunch, the company raised approximately $1 billion in its public offering, selling 44.3 million shares at $23 each, above its expected range. The offering suggests investors are increasingly willing to fund advanced reactor developers positioned around industrial decarbonization and long-term AI infrastructure demand.

X-energy is developing the Xe-100, an 80 MW helium-cooled high-temperature gas reactor using TRISO fuel pebbles. But the company’s commercial narrative extends beyond reactor technology alone. X-energy has tied its growth strategy directly to industrial heat, utility-scale power, and hyperscale data center demand — positioning itself as part of the broader buildout of firm energy infrastructure for AI-era computing.

Amazon is central to that narrative. X-energy is linked to a framework targeting as much as 5 GW of nuclear deployment by 2039, while Amazon’s Climate Pledge Fund has previously invested in the company. X-energy is also advancing its first Xe-100 deployment at Dow’s Seadrift industrial site in Texas. Together, those projects give the company one of the more commercially mature development stories among advanced reactor startups.

The May 6 Curtiss-Wright announcement further reinforced that the Xe-100 program is moving deeper into execution. Curtiss-Wright said it has transitioned key Xe-100 components — including helium circulator systems and reactivity control and shutdown systems — from design into prototype manufacturing. The company also explicitly linked the reactor platform’s efficiency and resilience to the growing requirements of energy-intensive AI data centers. Curtiss-Wright additionally referenced X-energy’s plans with Energy Northwest for up to 12 Xe-100 units, totaling 960 MW, under Amazon’s broader 5 GW deployment framework.

For the data center industry, the significance goes beyond reactor design. Capital access, manufacturing readiness, supply-chain depth, licensing progress, fuel strategy, and committed commercial partners are increasingly becoming the real indicators of viability. The market is no longer evaluating advanced nuclear solely as a technology concept. It is beginning to evaluate whether these companies can actually deliver infrastructure at industrial scale.

Constellation and Crane: The Existing Nuclear Fleet Remains the Near-Term Prize

Advanced reactors may dominate the headlines, but the most commercially relevant nuclear-data center story in the near term may still be the existing U.S. nuclear fleet. Constellation’s plan to restart the Crane Clean Energy Center — formerly Three Mile Island Unit 1 — is a prime example. The restart is backed by a 20-year agreement with Microsoft, while Constellation executives told investors the project now hinges heavily on regulatory decisions that will determine how and when the plant can inject full output into the grid.

Constellation is reportedly seeking approval to transfer grid injection rights from its Eddystone gas plant to Crane, with decisions potentially arriving as early as June or July. Utility Dive also reported that Constellation still aims to restart Crane before 2031 despite uncertainty surrounding full injection rights, while some prospective data center customers in PJM are delaying decisions pending greater clarity around colocation rules and backstop-auction policies.

That regulatory layer is critical because the issue extends well beyond whether nuclear generation exists. The real question is whether power can interconnect, receive capacity credit, satisfy market rules, and support bankable long-term data center contracts. In PJM, those questions now intersect with broader debates over colocated load, transmission cost allocation, grid reliability, and whether hyperscale data center customers should shoulder a larger share of new infrastructure costs.

Constellation’s broader 5 GW PJM queue position also illustrates how incumbent power producers are approaching AI load growth pragmatically rather than ideologically. The company’s pipeline includes nuclear uprates, gas-fired generation, and battery storage rather than a single-technology strategy. That portfolio approach reflects the emerging reality of AI infrastructure development: near-term power capacity will likely be assembled from multiple generation sources, with nuclear serving as a clean firm anchor where available rather than a standalone solution.

The Gas-Plus-Nuclear Bridge

Blue Energy and GE Vernova's May 5 announcement makes the bridge-power strategy explicit. The companies outlined a 2.5 GW collaboration combining natural gas and nuclear generation around GE Vernova Hitachi's BWRX-300 small modular reactor at Blue Energy's first planned Texas site, with a final investment decision targeted for 2027. The companies also signed a slot reservation agreement for two GE Vernova 7HA.02 gas turbines targeted for 2029 delivery to support early site energization.

The underlying logic is straightforward: gas turbines can deliver power quickly while the nuclear portion advances through licensing, construction, and commissioning. Blue Energy argues the model could compress conventional nuclear deployment timelines by bringing campuses online with gas first and transitioning later to nuclear-backed generation. GE Vernova framed the approach as a way to accelerate time-to-power for rapidly expanding AI infrastructure development.

For data center operators, the model aligns closely with how AI campuses are actually leased and deployed. Developers often need an initial tranche of capacity to satisfy anchor tenants, followed by incremental power additions over time. A gas-plus-nuclear architecture offers a potential pathway from near-term energization to longer-term clean firm capacity.
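
One way to see the commercial appeal is to treat campus capacity as a phased schedule. The sketch below uses hypothetical dates and tranche sizes (not figures from the Blue Energy and GE Vernova announcement) to show how a campus might serve anchor tenants on gas while nuclear capacity arrives later:

```python
# Hypothetical phasing of a gas-plus-nuclear campus; dates and megawatt
# figures are illustrative assumptions, not the companies' schedule.
phases = [
    (2029, "gas turbines energize the initial tranche", 800),
    (2032, "first SMR units come online",               600),
    (2035, "remaining nuclear capacity completes",      1100),
]

cumulative_mw = 0
for year, milestone, added_mw in phases:
    cumulative_mw += added_mw
    print(f"{year}: +{added_mw:>5} MW  ({milestone})  total {cumulative_mw} MW")
```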

But the model also introduces important commercial and regulatory questions. How much natural gas ultimately remains in the long-term operating profile? Who assumes the carbon-accounting risk during the transition period? And what contractual or regulatory mechanisms ensure the bridge to nuclear does not simply become a longer-term dependence on gas generation?

Those questions are likely to shape many of the next-generation AI power projects now emerging. Developers may market nuclear as the destination, but investors, regulators, utilities, and hyperscale customers will increasingly scrutinize the bridge itself.

Microreactors, Demonstrations, and the Symbolic Power of First Electrons

Not every nuclear-data center initiative is aimed immediately at gigawatt-scale AI campuses. Some of the industry’s most visible recent announcements are smaller demonstrations intended to validate pieces of the broader architecture.

In late April, Elemental Nuclear and the University of Utah announced plans to use the university's TRIGA research reactor to generate electricity for the first time in its 50-year history and power a miniature AI data center. The target output is only 2 to 3 kW — nowhere near commercial data center scale — but the project is designed to demonstrate that heat from a small reactor can be converted into electricity for a live GPU workload using a compact Brayton-cycle power system.
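
For perspective, the sketch below compares that output with rough, commonly cited power draws for AI hardware (the server and rack figures are industry rules of thumb, not numbers from the project):

```python
# Putting the demonstration's 2-3 kW target in context. The hardware power
# figures below are rough industry rules of thumb, not project data.
DEMO_KW = 3           # upper end of the Utah demonstration target
GPU_SERVER_KW = 10    # a typical 8-GPU AI server draws roughly 10 kW
AI_RACK_KW = 100      # dense AI racks are often quoted near 100 kW

print(f"{DEMO_KW} kW covers about {DEMO_KW / GPU_SERVER_KW:.0%} of one 8-GPU server")
print(f"{DEMO_KW} kW covers about {DEMO_KW / AI_RACK_KW:.0%} of one dense AI rack")
# In other words: enough for a handful of GPUs, not a commercial deployment.
```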

The demonstration does not solve the larger challenges surrounding commercial nuclear deployment, including siting, licensing, financing, fuel supply, security, redundancy, and operations. What it does provide is something more symbolic but still important: a visible proof point that fission-powered AI compute can move from theoretical discussion into functioning hardware.

The NANO Nuclear–Supermicro MOU occupies similar conceptual territory, though at a more infrastructure-oriented scale. The implied long-term vision is a modular architecture in which compute, cooling, and power generation are designed as an integrated system rather than separate layers of infrastructure. The challenge, as always, will be transforming symbolic integration into repeatable, financeable, and operationally standardized deployment models.

AI for Nuclear, Nuclear for AI

Oklo’s May 12 announcement with Idaho National Laboratory adds a reciprocal twist to the AI-energy narrative: AI is not only increasing demand for nuclear power, it may also become part of the nuclear development process itself. The collaboration will apply INL’s Prometheus AI platform to accelerate advanced reactor and fuel-system engineering workflows, including work related to Oklo’s Pluto reactor.

The initiative spans AI-assisted modeling, simulation, technical documentation, benchmarking, validation, and engineering workflows capable of interacting with complex multiphysics design environments while keeping human operators in the loop. For an industry defined by long development cycles and extensive regulatory documentation, even modest gains in engineering efficiency could prove meaningful.

The relevance to data centers is indirect but significant. If AI tools can shorten reactor design cycles, reduce engineering bottlenecks, or streamline documentation and validation processes, they could help narrow one of the industry’s central tensions: the widening gap between the rapid deployment cadence of AI infrastructure and the far slower timelines associated with advanced nuclear development.

Why Data Centers Are Moving Nuclear Forward

The common driver behind these announcements is not ideology. It is load growth. AI data centers require high-capacity, high-availability power that can be contracted years in advance, scaled alongside campus expansion, and aligned with long-term decarbonization goals. Traditional renewables remain essential but do not fully solve the problem of 24/7 firm capacity. Natural gas can be deployed quickly and flexibly but carries fuel-price volatility and emissions exposure. Meanwhile, grid power is increasingly constrained by interconnection delays, transmission bottlenecks, local opposition, and growing concern over ratepayer impacts.

Against that backdrop, nuclear presents an increasingly attractive theoretical fit: firm, high-capacity-factor generation with low operating emissions. Existing nuclear plants can support large-scale corporate power agreements today. Uprates can add incremental clean capacity. Restart projects can return retired assets to service. SMRs and microreactors, meanwhile, promise a more modular future potentially better aligned with campus-scale AI infrastructure growth.
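
The capacity-factor arithmetic behind that framing is simple. The sketch below uses a hypothetical 500 MW campus and rough, commonly cited U.S. capacity factors (none of these numbers come from the announcements above) to show why firm generation changes the nameplate math:

```python
# Nameplate capacity needed to energy-match a hypothetical 500 MW campus,
# using rough, commonly cited capacity factors (illustrative assumptions).
LOAD_MW = 500
HOURS_PER_YEAR = 8760
annual_need_mwh = LOAD_MW * HOURS_PER_YEAR

capacity_factors = {"nuclear": 0.92, "onshore wind": 0.35, "utility solar": 0.25}

for source, cf in capacity_factors.items():
    nameplate_mw = LOAD_MW / cf
    print(f"{source:>13}: ~{nameplate_mw:,.0f} MW nameplate to supply {annual_need_mwh:,} MWh/yr")

# Energy-matching also ignores hourly firmness: variable sources still need
# storage or firm backup to serve a 24/7 load, which is the gap nuclear targets.
```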

But the distance between concept and deployment remains substantial. Advanced reactor projects still face major hurdles spanning licensing, financing, manufacturing, fuel supply, construction, insurance, operations, and community acceptance. For data center developers, attractive physics alone is not enough. They need bankable schedules, predictable delivery, and operational certainty. A delayed reactor does not power GPUs, and a delayed GPU campus can easily miss the market window that justified the investment in the first place.

Regulation, Interconnection, and Community Risk

The U.S. regulatory framework is beginning to evolve in ways that could materially affect advanced reactor deployment. The NRC’s proposed Part 53 framework is intended to create a more risk-informed and technology-inclusive licensing pathway for advanced reactors, while emerging microreactor-specific approaches could eventually support more modular and repeatable deployment models.

For data center applications, however, the critical questions are less about theoretical licensing pathways than operational practicality. Can regulators support repeatable reactor approvals without weakening safety confidence? Can project structures clearly define ownership, liability, and operational responsibility? And can advanced nuclear projects move at a pace compatible with AI infrastructure deployment cycles?

That ownership issue is especially important because a data center operator is not a nuclear utility. Even highly automated or factory-built reactors are unlikely to be treated by regulators or communities like backup generators or battery systems. The most plausible deployment models will likely involve reactor developers, utilities, independent power producers, or specialized nuclear operators owning and operating the nuclear asset while data center companies purchase power, capacity, or energy services through long-term agreements.

Community politics may prove equally challenging. Data centers already face mounting opposition around land use, water consumption, transmission infrastructure, backup generation, tax incentives, noise, and electric-rate impacts. Introducing an on-site or adjacent nuclear facility fundamentally changes the political dynamics of those projects. Even where technical risk is low, perceived risk may remain high. Developers will need sophisticated public-engagement and community-trust strategies alongside the engineering and financing models required to build the projects themselves.

The Search for a Deployable Nuclear Model

The NANO Nuclear–Supermicro MOU is not yet a deployment story. Neither is the Terrestrial–Riot collaboration. X-energy’s IPO does not guarantee reactor delivery. Constellation’s Crane restart still depends on regulatory and interconnection decisions. Blue Energy’s gas-plus-nuclear strategy still has to prove that its bridge to nuclear becomes something more than an extended reliance on gas. Elemental Nuclear’s University of Utah demonstration remains symbolic rather than commercial. Oklo’s AI-enabled engineering initiative is a development tool, not a power contract.

Yet the broader pattern is becoming difficult to ignore. Nuclear power has moved from the margins of the data center conversation toward the center of long-term AI infrastructure planning. The industry is no longer asking only whether nuclear might someday support data centers. It is actively testing which models can realistically deliver power at the speed, scale, reliability, and commercial structure that AI infrastructure now demands.

That competition is unfolding across multiple fronts simultaneously: existing nuclear PPAs, reactor restarts, uprates, SMR-backed energy campuses, microreactor demonstrations, gas-plus-nuclear bridge architectures, and integrated compute-and-power development models.

The central question is no longer whether nuclear belongs in the future of AI infrastructure. The question is whether the nuclear industry can evolve quickly enough to match the deployment cadence of the AI buildout now underway.

 

At Data Center Frontier, we talk the industry talk and walk the industry walk. In that spirit, DCF Staff members may occasionally use AI tools to assist with content. Elements of this article were created with help from OpenAI's GPT5.


About the Author

David Chernicoff

David Chernicoff is an experienced technologist and editorial content creator focused on the connections between technology and business, helping each side understand and get the most from the other.