Roundtable: 4 Data Center Experts Assess Where AI is Headed

March 27, 2023
The AI conversation is everywhere. In our DCF Executive Roundtable, four data center experts offer their take on the meaning of our current "AI moment" and where it may take us.

The AI conversation is everywhere. Everyone is talking about artificial intelligence, and how ChatGPT, Bard, MidJourney and other generative AI tools might change business and society.

The rise of generative AI has been driven by advances in hardware and data centers. So who better to assess the meaning of our current "AI moment" than four data center experts? Our quarterly Data Center Frontier Executive Roundtable features the insights of four seasoned C-suite executives. In addition to the current state of AI, our First Quarter 2023 roundtable will explore several other relevant topics: pricing trends for data center space, the potential of hydrogen as an alternative power option, and the future of massive "MegaCampuses."

Here’s a look at our distinguished panel:

- George Slessman of DCX
- Nancy Novak of Infrastructure Masons
- Chris Downie of Flexential
- Tim Mirick of Sabey Data Centers

Each day this week I’ll moderate a Q&A with these executives on one of our key topics. We begin with our panel’s take on how the huge interest in generative AI may bring changes in data center hardware and design.

Data Center Frontier: The arrival of generative AI technologies, popularized by ChatGPT, has the potential to disrupt search and other leading Internet businesses. How might the huge interest in generative AI bring changes in data center hardware and design?

George Slessman, DCX: If we presume that the current state of the art in deep learning remains the dominant paradigm for learning systems in the near and mid-term, the Data Center (capital D, capital C: the full stack of space, power, cooling, network, storage and compute) will see dramatic changes in use, architecture and location.

Specialty accelerators (GPUs, TPUs, IPUs, etc.) will be the defining feature of compute node architecture and of data center power design, with power concentration and utilization rising significantly. The distance between accelerators will be the driving force pushing density well beyond current standards, and locations will need to be highly distributed to support low-latency inferencing at scale, alongside a number of very large-scale training nodes.

Traditional enterprise and service provider data center design will need to be replaced or completely refactored to support this paradigm. I believe Facebook announced this a few months ago… DCX has built an entirely new Data Center platform on this premise.

Nancy Novak, Infrastructure Masons: While the technology may be disruptive, from a data center perspective it will still come down to power and cooling. For example, AI requires faster processors than the servers currently resident in data centers. For the data centers that support AI applications, this means two things: we are going to need a lot of power, and the heat generated will make cooling even more important than it already is.

AI’s ravenous power and cooling requirements will impact facility design, but perhaps not in the way we might expect. Certainly, we can expect them to pull in technologies that have so far gained only moderate traction in the industry, such as liquid cooling at the rack level. But from a design perspective, we may see a proliferation of smaller facilities, since the technical and consumer needs for AI-driven functionality would seem to lend themselves to a higher degree of decentralization than we’ve seen to date.

If this trend does become reality, we’ll see end users reorienting many of their priorities. For example, if I have a dispersed set of facilities, the failure of one doesn’t have the same impact as an outage does today, so the decision to use primarily N-based designs becomes more economically justifiable.
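The economic case sketched above can be made concrete with some back-of-envelope probability. The sketch below uses entirely hypothetical outage figures (the article quotes none) to compare one large site with 2N redundancy against several dispersed, independent N-design sites:

```python
# Illustrative sketch with hypothetical numbers: probability of losing ALL
# capacity for one large 2N facility vs. several dispersed N facilities.

def site_outage_prob(component_outage: float, redundant: bool) -> float:
    """Probability a single site is down in a given window.
    With 2N redundancy, both independent power/cooling trains must fail."""
    return component_outage ** 2 if redundant else component_outage

def total_loss_prob(n_sites: int, per_site_down: float) -> float:
    """Probability that every independent site is down at once."""
    return per_site_down ** n_sites

p = 0.01  # assumed per-window outage probability of a non-redundant site

one_big_2n = site_outage_prob(p, redundant=True)                 # 1 in 10,000
four_small_n = total_loss_prob(4, site_outage_prob(p, redundant=False))
print(one_big_2n, four_small_n)  # the dispersed fleet is far less likely
                                 # to lose all capacity simultaneously
```

Under these assumptions the dispersed N-based fleet achieves a lower total-loss probability than a single 2N site, which is the intuition behind trading per-site redundancy for geographic dispersion.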

I think the AI-driven changes to data center design and hardware will certainly evolve over time, but in significantly compressed increments. That will place a great deal of pressure on providers to adapt with the nimbleness required to respond to a quickly changing landscape.

Chris Downie, Flexential: From answering customer service inquiries to creating legal documents, ChatGPT has already proven to be a transformative solution across industries. Since its debut, our hyperscale customers have been seeking environments that support these capabilities; to meet this need, we’ve taken proactive steps to offer AI-optimized data center infrastructure.

For instance, we’re driving up power densities in our core facilities to support, on average, 10-12 kW per rack for new workloads, with some AI workloads requiring well over 40 kW per rack. We’re modifying our physical environments to accommodate these demands, such as by increasing cabinet heights and trialing new liquid cooling techniques.
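Those per-rack densities translate directly into a facility's critical IT load. A quick sketch using the densities quoted above (the rack counts are hypothetical, invented for illustration):

```python
# Back-of-envelope sketch of how a small number of high-density AI racks
# shifts total critical IT load. Rack counts are hypothetical; the kW-per-rack
# densities are the figures quoted in the text.

def total_it_load_kw(racks: dict[str, tuple[int, float]]) -> float:
    """Sum rack_count * kw_per_rack across rack types."""
    return sum(count * kw for count, kw in racks.values())

deployment = {
    "standard": (200, 11.0),  # ~10-12 kW per rack for new workloads
    "ai":       (20, 40.0),   # some AI workloads exceed 40 kW per rack
}

print(total_it_load_kw(deployment))  # 200*11 + 20*40 = 3000.0 kW
```

In this sketch, 20 AI racks add more than a third as much load as 200 conventional racks, which is why density and cooling dominate the design conversation.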

At Flexential, we work with customers to ensure they get what they need today and as their needs grow. As major tech companies accelerate AI rollouts, applications will require more computing capacity and in turn, clients' connectivity needs are evolving.

Given this, Flexential is continuously rolling out new interconnection solutions. Most recently, we announced an enhancement for Flexential cloud customers with the enablement of three additional data centers in Charlotte, Nashville, and Louisville. This expansion empowers Flexential cloud customers with a superior user experience, enhanced network performance and the ability to extend applications and workloads beyond on-premises data centers. In 2023, we’re going to continue to look for ways to empower customers' IT journey, helping them prepare for - and succeed in - the AI-driven future.

Tim Mirick, Sabey Data Centers: Innovations in AI and associated technologies are driving compute cycles up and to the right as fast as any development we have seen in the last decade. This is going to drive chip innovation, leading to increased density and heat for data centers to address. Solutions like immersion and direct-to-chip liquid cooling could be perfectly positioned to answer these issues.

The best data center operators have been thinking about liquid cooling for a long time, but traditional colocation still accommodates a broad range of densities and that keeps the optimization of air cooling as the focus. As AI technology drives innovation in hardware and more liquid-cooled options are available, facilities will be able to plan for denser environments and drive efficiency. Ultimately, this will help with per kW cooling costs and lower our energy use.

NEXT: Trends in pricing for data center space and power.  

Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with DCF on LinkedIn, and signing up for our weekly newsletter using the form below:

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
