Roundtable: Data Center Forecasts In AI's Inflection Point Year

Sept. 20, 2023
We asked our Executive Roundtable to forecast near-term data center AI demand from the vantage point of the pivotal year of 2023, a moment created by the rapid rise of LLMs such as ChatGPT and of GPU hardware such as Nvidia's now ubiquitously in-demand H100 system.

For today's installment of Data Center Frontier's Executive Roundtable, our four data center industry leaders share their forecasts for data center AI demands in the technology's inflection point year of 2023.

Although modern AI technology has been ramping up for more than a decade through various permutations of deep learning and big data, history will show that 2023 was the year ChatGPT, the large language model-based chatbot developed by OpenAI and released at the tail end of 2022, running on hardware such as Nvidia's H100 GPU system, truly opened the floodgates for AI.

As is now commonly understood, AI's crossroads moment in the data center was created by the sudden rise of LLMs such as ChatGPT intersecting with a surge in GPU and TPU chip advancements, rollouts and, yes, shortages, from technology front-runners led by Nvidia and Google. The two AI giants recently announced an expanded partnership to advance the fields of AI computing, software and services.

Nvidia's partnerships are well noted and prolific. Next year we will probably be talking about Nvidia taking over the AI cloud, and about how hyperscalers appear more inclined to join them rather than try to beat them. But for now, we ask our four distinguished industry leaders to share their forecasts for data center AI demand in this inflection point year of 2023.

Our Executive Roundtable's expert panelists include:

  • Jim Buie, President and CEO, Involta
  • Joe Reele, VP of Solutions Architects, Schneider Electric
  • Kevin Imboden, Global Director of Market Research and Competitive Intelligence, EdgeConneX
  • Matt Zieg, VP of Global Strategic Accounts, Vertiv

Now onto the second question of the week for our Executive Roundtable for the Third Quarter of 2023:

Data Center Frontier:  Given the rapid acceleration of AI technology momentum this year, what’s your 12-24 month outlook for planning or spending by hyperscale and colocation providers on any type of data center AI technology upgrades related to workload management and demand uptake from enterprise and consumer cloud customers?

Jim Buie, Involta:  We are currently bidding on over 100 MW of demand related to these high-density workloads.

AI will be the next growth driver for the industry, with hundreds of millions to billions of dollars in capital building the next generation of data centers.

There is a long tail of need for the lower-density data centers that exist today to host critical applications. The future, however, is about high density, the ability to cool efficiently, and connectivity.
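For a rough sense of what "high density" does to a fixed power envelope, the back-of-envelope sketch below is purely illustrative: the 100 MW figure comes from the quote above, but the per-rack densities and PUE value are assumptions, not numbers supplied by Involta or any other panelist.

```python
# Illustrative back-of-envelope: how many racks a given block of contracted
# power supports at different rack densities. All per-rack and PUE figures
# below are assumptions for the sketch, not numbers cited by the panel.

def racks_supported(total_mw: float, kw_per_rack: float, pue: float = 1.3) -> int:
    """Approximate rack count for a given utility power budget.

    total_mw    : contracted power in megawatts
    kw_per_rack : assumed IT load per rack in kilowatts
    pue         : assumed power usage effectiveness (cooling/facility overhead)
    """
    it_power_kw = (total_mw * 1000) / pue  # portion of power left for IT load
    return int(it_power_kw // kw_per_rack)

if __name__ == "__main__":
    for density in (10, 40, 80):  # assumed kW/rack: legacy, dense AI, liquid-cooled AI
        print(f"100 MW at {density} kW/rack ≈ {racks_supported(100, density):,} racks")
```

The takeaway is simply that the same contracted power yields far fewer racks as density climbs, which is why efficient cooling sits alongside density and connectivity in the outlook above.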

Joe Reele, Schneider Electric:  When you talk about AI from the hyperscalers' perspective, there are things that need to be taken into account across the entire fabric of the data center, both IT and OT. At a very high level, that means the compute, network and storage (both the technology and the architecture of each), plus the power and cooling architecture, technology and operations.

Similarly in colocation, minus the IT, there could be an uptick in "AI infrastructure" such as liquid cooling at much higher power densities. Sure, there will be spending across all of that to accommodate growing adoption and use, but it will probably be a small portion of the overall, call it "BAU," spend.

Kevin Imboden, EdgeConneX:  Nvidia's second-quarter results provide the answer: a record number of GPUs are being purchased for AI workloads, at a rate that will be slowed only by the ability to produce them!

For the industry, this will provide a strong incentive for change, with hyperscalers and colocation providers alike required to re-engineer their facilities for denser workloads beyond anything seen previously.

Guidance from public companies is a good indicator, with several explicitly outlining higher capital expenditure in the coming months to account for these requirements.

Expect continued spend over the next two years, with optimization to follow as training gives way to inference and the structure of demand changes.

Matt Zieg, Vertiv:  High-performance, compute-intensive applications like AI will require a significant increase in IT investment, and that investment will be more distributed in nature, bringing compute capacity closer to where the information is consumed.

The rise of AI and other high-performance computing applications will likely lead to higher spending on thermal and power infrastructure to support these applications.

Next: Data Center Investor and Provider Relations Outlook

About the Author

Matt Vincent

A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.
