The Eight Themes That Will Shape the Data Center Industry in 2024

Jan. 2, 2024
The AI boom will ripple through the digital infrastructure sector, impacting the availability of space, the supply chain, pricing, cooling, power and design.

Artificial intelligence is hot. So hot that the AI boom is creating a resource-constrained world, driving stupendous demand for GPUs, data centers and AI expertise. All three are likely to be in short supply, but none so much as wholesale data center space.

This is the trend that dominates our annual forecast. Each January we identify eight themes that will shape the data center business. Our crystal ball has done pretty well in recent years, so it’s time to look ahead to 2024.

1.  The AI Boom Creates a Data Center Space Crunch

The days of prospects touring empty data halls are over. Data center developers are effectively out of wholesale data center space in many major markets. Hyperscale users have gobbled up most of the capacity in development.

“Much of the anticipated 2024 supply will be pre-leased, resulting in limited options for users who are not in the market far in advance of their preferred go-live date,” JLL reported.

The AI land grab arrives amid a supply-constrained market. The average vacancy rate across North American data center markets stood at 2.7% in the third quarter of 2023, according to datacenterHawk. In some markets, like Northern Virginia, the vacancy rate is about 1%.

The challenging environment for data center capacity is beginning to be felt. Oracle’s cloud revenue growth was slowed by the pace of its cloud deployments. 

“Frankly, the only limiting factor is our ability to get the data centers handed over and filled up fast enough,” said Oracle CEO Safra Catz in the company’s earnings call. “This quarter alone, we're talking about hundreds of millions of dollars that we would have been able to recognize if our capacity was available.”

Oracle is slightly newer to the hyperscale game than its rival cloud platforms, which have been working far in advance to lock down data center capacity. Cloud builders are now leasing entire campuses, as seen in the recent pre-lease of five buildings and 430 megawatts in Northern Virginia in a single transaction.

With these pre-leases, cloud platforms have secured capacity to meet the near-term demands of their AI infrastructure. The key questions: how much space remains for everyone else, and how long can developers keep pace with AI-powered growth?

Schneider Electric estimates that AI power consumption will grow at a rate of 25% to 33% annually through 2028, when it could reach as much as 18.7 gigawatts.
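Those numbers imply straightforward compound growth. A quick sanity check in Python (the ~4.5 GW baseline for 2023 is our illustrative assumption, not a figure from the article; the 25% and 33% rates are the bounds Schneider cites):

```python
# Rough check on Schneider Electric's AI power projection.
# BASELINE_GW is an illustrative assumption for 2023 consumption;
# the growth rates are the 25%-33% range cited in the article.

BASELINE_GW = 4.5   # assumed 2023 AI data center power draw (GW)
YEARS = 5           # 2023 -> 2028

low = BASELINE_GW * 1.25 ** YEARS    # 25% annual growth
high = BASELINE_GW * 1.33 ** YEARS   # 33% annual growth

print(f"Implied 2028 AI power demand: {low:.1f} to {high:.1f} GW")
```

At 33% annual growth the model lands at roughly 18.7 GW, matching the figure above, which suggests that number represents the upper-bound scenario.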

In recent months OpenAI, Microsoft and other AI companies have begun rolling out tools for users to deploy custom chatbots and GPT applications. OpenAI has laid the groundwork for an app store that could create a supercharged developer ecosystem.

Where will enterprises run those AI workloads? The data center shortage may be a particular problem for enterprise users, as we’ve previously noted. Colocation specialists may be able to help with smaller requirements. But if enterprises have difficulty procuring data center space, their best option for AI capacity will be the very cloud platforms that pre-leased all that space. That could boost the already strong demand for cloud services.   

That’s Oracle’s expectation. “We expect (Oracle Cloud) to just grow astronomically, frankly,” said Catz. “And of course, also as more GPUs become available and we can put those in, we have just a really unlimited amount of demand.”

As demand soars, the data center space crunch will create ripples through almost all elements of digital infrastructure, impacting the supply chain, pricing, cooling, design and power infrastructure. The rest of our predictions unpack what this may look like.

The good news: The digital infrastructure sector now has the maturity, capital support and history of innovation to execute against all these challenges.

2.  Rethinking Power on Every Level 

Making the Internet available everywhere, all the time, requires an enormous amount of infrastructure and power. Data centers have become the front lines in our society’s digital transition, creating tension with the limitations of the power grid.

Utilities are struggling to upgrade transmission networks to support surging demand for electricity from data centers. CBRE says data center construction completion timelines have been extended by 24 to 72 months due to power supply delays.

“This is our biggest challenge over the next 3-4 years, and a limiting factor on how fast we can continue to grow,” Digital Bridge CEO Marc Ganzi told Bloomberg Markets. “If you’re not building data centers adjacent to low-cost renewable power, you’re not going to have a fruitful conversation with the hyperscalers and cloud players. It’s an absolute must if we want to continue to do repeat business with our customers." 

“We can’t rely on just the grid anymore,” said Ganzi. “We’ve got to be able to take things into our own hands, protect our supply chain, and deliver for our customers.”

Although the constraints in Northern Virginia have made headlines, power availability has quickly become a global challenge, impacting major markets in Europe and Asia as well as U.S. hubs like Ashburn, Santa Clara, and sections of Dallas and Suburban Chicago. 

Last year we predicted the rise of on-site power generation, but we’ve yet to see it deployed at scale. Still, data center operators are working on a range of new approaches to power, including the following:

  • Microsoft is hiring a team to deploy small modular reactors (SMRs) and microreactors to support its cloud. It is also working with Terra Praxis on a generative AI model to streamline nuclear regulatory licensing and applications.
  • AWS (Oregon) and Microsoft (Dublin) are using natural gas generation to support new multi-facility campuses.
  • Google and Fervo Energy have begun operating a novel geothermal power system in Nevada to support Google’s data centers, using fiber-optic sensors to tap the earth’s heat.
  • Several data center projects in Kenya will use geothermal energy from an energy park. 
  • Startup ECL has announced its first modular data center using hydrogen fuel cells, a technology that is also being evaluated by Microsoft and Equinix. 
  • Fuel cell provider Enchanted Rock will buy renewable natural gas (RNG) made from food waste to help power a microgrid that provides resilience at a Microsoft data center in San Jose.

Expect to see innovations in power continue as data centers seek better visibility into their power sourcing.       

3.  Pricing for AI Capacity Will Continue to Climb

Supply constraints almost always show up in pricing, and that will continue to be true in the data center market in 2024, especially if tenants compete to secure limited capacity.

“As AI demand continues to grow, it significantly impacts the data center industry’s pricing structure,” notes datacenterHawk’s Rhett Gill in a recent analysis. “Rates have increased considerably, driven by new companies willing to pay higher prices per kW per month to secure power and de-risk their investments.”

CBRE says pricing for 250 kW to 500 kW requirements rose 16% in 2023 and will rise another 10% to 15% in 2024.
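Those increases compound. A quick illustration (the $100/kW starting rate is hypothetical; only the percentage increases come from CBRE):

```python
# Compounding CBRE's reported and forecast wholesale price increases.
# The $100 base rate is a hypothetical round number for illustration.

base_2022 = 100.0               # $/kW per month, hypothetical
after_2023 = base_2022 * 1.16   # +16% in 2023
low_2024 = after_2023 * 1.10    # +10% scenario for 2024
high_2024 = after_2023 * 1.15   # +15% scenario for 2024

print(f"2024 rate: ${low_2024:.2f} to ${high_2024:.2f} per kW per month")
```

A tenant signing in 2024 would thus pay roughly 28% to 33% more than 2022 rates for the same capacity.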

“This is due to supply constraints and continued strong demand across all markets,” CBRE said in its 2024 Outlook. “Operators may be willing to lower lease pricing for legacy assets with vacancy, but this discount will not apply to artificial intelligence workloads that require high power density."

4.  Supply Chain: Relationships Matter More Than Ever

Delivery dates for data centers have lengthened due to disruptions in the supply chain. Developers want certainty in their delivery timelines and in 2024 will seek to use deeper partnerships and M&A to improve the predictability of their supply chain.

The recent $3 billion Compass-Schneider deal to integrate their supply chains is one example. Infrastructure funds and other capital sponsors can play a role in supporting their data center platforms, as many have relationships that span the energy and industrial sectors. These become more important as larger projects test traditional approaches to supply chains.

"The increase in demand has exceeded the capacity of traditional infrastructure deployment methods to efficiently support it, and this will continue to challenge project timelines and milestones well into 2024 and beyond," said Zech Connell, VP of Program Development with BluePrint Supply Chain, in the recent DCF Executive Roundtable.

“In 2024, the supply chain faces significant challenges in meeting heightened demand,” said Jeffrey Kanne, President & CEO of National Real Estate Advisors, the sponsor of Sabey Data Centers. “Aligning with customers on delivery timelines remains a critical obstacle. The industry grapples with 52+ week lead times for crucial data center components such as switchgear, power distribution equipment, generators, and chillers.”

“Since the pandemic's onset in 2020, significant investments in long-lead equipment have become crucial to support leasing and development activities while avoiding disruption for current customers,” said Kanne.

This won’t necessarily involve M&A, but we may see creative dealmaking with joint ventures or strategic partnership structures that bring more clarity to equipment procurement and delivery timelines.

5.  More Momentum for Modular

Modular design has become an important element of data center construction, particularly in the delivery of pre-packaged power rooms and cooling equipment. With a limited supply of colocation space and wholesale capacity available, deployment of prefabricated IT modules may become a more attractive alternative.

Many of the current modular IT products target the market for edge computing. But for enterprises with real estate and power capacity, placing a module in a parking lot or warehouse may be the shortest path to an AI deployment. These factory-built form factors offer availability and speed-to-market.

It’s worth noting that the supply chain partnership between Compass and Schneider involves modular data centers. Schneider has also partnered with JLL and Lumen Technologies to identify deployment sites for modular units, while Vertiv just launched TimberMod, a prefabricated enclosure made of mass timber, for sustainability-focused deployments.

Stealthy startup Armada recently unveiled its plan to deploy modular data centers using Starlink satellites for connectivity. The company also announced $55 million in funding from venture capital firms, which are not frequent investors in digital infrastructure hardware.

6.  AI Drives Design Updates for Power and Cooling

Many AI implementations take the form of high-density zones inside facilities designed for cloud-scale densities of about 10 to 12 kW per rack, or perhaps even lower.

This will change over the next several years, with the emergence of entire facilities optimized for the extreme density of AI workloads, including significant installations of liquid cooling infrastructure. In other cases, facilities might balance AI and cloud workloads.  
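The density gap translates directly into floor planning. A minimal sketch (the 40 kW AI rack is our illustrative assumption; the 10 kW cloud rack reflects the densities discussed above):

```python
# How per-rack density changes what a 1 MW data hall can hold.
# The 40 kW AI rack figure is an illustrative assumption.

HALL_CAPACITY_KW = 1000  # 1 MW of critical IT load

def racks_supported(rack_kw: float) -> int:
    """Whole racks a hall can power at a given per-rack density."""
    return int(HALL_CAPACITY_KW // rack_kw)

print(racks_supported(10))  # cloud-density rack
print(racks_supported(40))  # assumed AI-density rack
```

The same 1 MW hall that holds 100 cloud racks supports only 25 AI racks, concentrating far more heat in far less space and driving the investment in liquid cooling.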

That’s the approach at Meta, which has reworked its data center design around a hybrid model combining direct-to-chip water cooling for GPUs and air cooling for cloud workloads. It has also simplified its power distribution, eliminating switchgear that created capacity bottlenecks.

“As we look to the future, the generative AI workloads and models are much more complex," said Alexis Bjorlin, VP of Engineering Infrastructure at Meta. "They require a much larger scale. Whereas traditional AI workloads may be run on tens or hundreds of GPUs at a time, the generative AI workloads are being run on thousands, if not more.”

Doing density at scale requires some adjustments. In a recent white paper, Schneider Electric outlined design guidance on adapting to AI workloads, including:

  • Using wider, deeper and taller racks to accommodate rack-level power distribution units and liquid cooling manifolds.
  • Reviewing options and locations for coolant distribution units (CDUs) to support liquid cooling.
  • Assessing flooring – especially raised floors – to ensure it can support heavier racks and different types of equipment.
  • Upgrading from 120/208 V to 240/415 V electrical distribution, and doubling down on education about arc flash risk.
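The voltage upgrade in that last point matters because deliverable power on a circuit scales linearly with voltage. A sketch of three-phase circuit capacity at unity power factor (the 32 A breaker rating is an illustrative assumption; the voltages are the distribution options named above):

```python
import math

# Three-phase power: P = sqrt(3) * V_line * I (unity power factor).
# The 32 A circuit rating is an illustrative assumption; 208 V and
# 415 V are the distribution voltages named in the guidance above.

def three_phase_kw(line_voltage: float, amps: float) -> float:
    """Deliverable power (kW) on a three-phase circuit."""
    return math.sqrt(3) * line_voltage * amps / 1000

print(f"208 V / 32 A circuit: {three_phase_kw(208, 32):.1f} kW")
print(f"415 V / 32 A circuit: {three_phase_kw(415, 32):.1f} kW")
```

Doubling the distribution voltage roughly doubles the power a rack can draw over the same conductors, which is why the higher-voltage scheme pairs naturally with denser AI racks. It also raises available fault energy, hence the call for more arc flash education.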

As with the rise of cloud computing, the transition to high-density AI data centers will take several years. This shift will begin in earnest in 2024.     

7.  Air Permitting at Scale Is a Hot Potato

Doing backup power at MegaCampus scale can be complicated. This issue is likely to be felt beyond Maryland, where a state-level regulatory decision created controversy in 2023.

Aligned Data Centers canceled a planned development at the Quantum Loophole campus in Frederick County after the Maryland Public Service Commission limited the company to 70 MW of generator capacity, far less than the 168 gensets Aligned had requested. At issue was whether the data center generators should be regulated as a power generation station rather than as individual backup power units.

As MegaCampuses proliferate, the Maryland dispute could be an early indicator of conflicts to come. Air quality boards are wary of new diesel capacity, given growing concerns about fossil fuel emissions and climate change. The data center industry is working hard on greener options for backup power, but diesel is likely to retain a central role while these new technologies become ready for Internet scale.

In 2020 Microsoft announced plans to eliminate its reliance on diesel fuel by the year 2030. Alternatives are being developed, including biofuels, hydrogen fuel cells and large lithium-ion batteries. Most of these solutions have been deployed on a modest scale.      

In the meantime, data center operators want clarity on the ground rules. Maryland Gov. Wes Moore plans to introduce legislation to define how data centers are classified.

“We think that a definition of what is, and what is not, a power plant makes a lot of sense and reduces the regulatory uncertainty holding back data center development in Maryland,” Quantum Loophole CEO Josh Snowhorn told Capacity.

8.  Site Selection Optimizes for Green MegaCampuses

Site selection requirements are changing, and we will see data centers in new places. This has been a core thesis of DCF since our launch manifesto in 2015.

The Green MegaCampus is the hot new form factor for site selection. These projects can require hundreds of megawatts of renewable power, hundreds of acres of land, and access to recycled water. As the criteria become more exacting, finding suitable real estate becomes trickier, especially with scarcity of power and water in key markets.

Long-term access to power – particularly the renewable power coveted by hyperscalers – is the coin of the realm for site selection. In major data center markets, this is shifting development activity to new submarkets with available transmission capacity on the local grid. It’s also opening doors for secondary markets with abundant land and power to book more data center projects.

Sustainability goals are also steering operators toward sites that offer ready access to renewable energy via local generation or wholesale markets. Water issues are playing a role as well. One example: Crane Datacenters is building its Oregon campus in Forest Grove, west of Hillsboro, where a water treatment plant can provide recycled water for cooling.

More change could be on the horizon. In a recent white paper, Schneider Electric notes that 95% of AI capacity now lives in the core of the network, in data centers and HPC facilities. Over the next five years, Schneider projects that AI capacity will become more distributed, as the need for distributed inference shifts 50% of workloads to edge data centers.

If Schneider is right, the AI-powered transformation of the data center industry will continue to evolve with speed and scale.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
