AI’s Yellow Brick Road

Oct. 20, 2023
Phillip Marangella, Chief Marketing and Product Officer for EdgeConneX, outlines what data center providers must do to enable the full potential of AI.
Up to now, one of the most remarkable things about Artificial Intelligence is how quickly the term has become part of mainstream conversations around the globe. Many articles and papers trace AI’s origins back to the Tin Man character in the novel ‘The Wonderful Wizard of Oz’, published in 1900.
This character was a human who became a tin man but retained human intelligence and personality. The author followed that book up in 1907 with a sequel featuring a robot named Tik-Tok, often considered the first rendering of a machine with some level of human intelligence.

It's not surprising that AI was conceptualized first in fiction, which was also true of space travel, moon landings, submarines, surveillance states, wristwatch phones, Frankenstein, and more. These concepts emerged first from human imagination, but they all needed science and technology to make them a reality.

While there have been compact, personalized examples of AI’s power in services like digital assistants, and while it has served as a misunderstood plot point in movies for a few decades, the real power and possibilities are only now being recognized for their transformational potential. Virtually every industry, every service, and every product will be disrupted to some degree by the impacts AI will have on finance, healthcare, construction, operations, IT, product design, manufacturing, logistics, and more.

In its 2023 Hype Cycle for AI, Gartner lists 29 technologies plotted from the ‘Innovation Trigger’ to the ‘Plateau of Productivity’. Most of those technologies are projected to reach that level of productivity in five years or less, although history would indicate that some of them will prove unproductive or even unfeasible to deploy successfully.

The Next Great Technological Revolution

AI has been a long time coming, and its payoffs seem almost within our grasp. But none of these outcomes will be possible without first laying the foundation: the machine learning infrastructure needed to develop and deploy these technologies across industries, disciplines, locations, and cultures. And that’s where data centers are once again playing a critical role in enabling another technological revolution. Data centers were essential to the global proliferation of the internet; they empowered the pervasiveness of the smartphone and mobile computing; and the cloud was born out of centralizing and sharing compute power at scale inside data centers. AI builds upon all three of those revolutions, and it will depend on each of them, and on the power, literal and figurative, that data centers provide, to deliver its full potential to the world.

Built to Density

While data centers will be critical to the success of AI, they also present a challenge to data center providers. The GPUs and other specialized chips driving AI, including ASICs and FPGAs, deliver massive compute performance but also consume massive amounts of power as rack densities rapidly scale. Historically, power densities inside data centers have increased in very small increments. However, the rapid advancement of GPUs and the compute power they deliver means that densities are pushing the boundaries of traditional air-cooling methodologies. While average densities have remained in the single digits of kW per rack over the last 20+ years, AI densities range from 25-50kW per rack today, and those numbers are expected to double or triple in the next two to four years. The key will be the ability to design data centers that are both built to suit customers’ specific requirements for their AI platforms and built to high-density specifications, with the flexibility to scale in both capacity and power over time.
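The density figures above translate directly into capacity-planning math. A minimal sketch, assuming a hypothetical 10 MW data hall (the hall size and the 100-150 kW data points are assumptions for illustration; the 25-50 kW range is from the article):

```python
# Hypothetical capacity-planning sketch (figures are illustrative, not
# EdgeConneX data): how many racks a fixed critical-power envelope can
# support as per-rack density climbs from legacy single digits to AI-era
# values.

def racks_supported(hall_kw: int, density_kw: int) -> int:
    """Racks a hall can power at a given per-rack density (kW)."""
    return hall_kw // density_kw

HALL_CRITICAL_POWER_KW = 10_000  # assumed 10 MW data hall

for density in (8, 25, 50, 100, 150):  # kW per rack
    racks = racks_supported(HALL_CRITICAL_POWER_KW, density)
    print(f"{density:>4} kW/rack -> {racks:>5} racks")
```

The same power envelope that once fed over a thousand legacy racks supports only a few dozen AI racks at projected densities, which is why flexibility in both capacity and power matters.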

These higher densities will bring a greater need for innovative cooling solutions that don’t sacrifice the recent gains in efficiency and sustainability we have seen across the industry. As higher-density needs emerge, more efficient cooling solutions will need to keep pace, and data center providers will need to accommodate improved air cooling as well as direct-to-chip, liquid, and immersion cooling. These advances promise reduced emissions and lower PUE values while dramatically increasing computing power at the rack level and, by extension, allowing significant growth in the capacity accessible in a data center at virtually any scale. The velocity and breadth of these innovations will make it even more challenging for most companies to design, build, and operate their own data centers.
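PUE (Power Usage Effectiveness) is the ratio of total facility power to IT power, so a lower PUE from more efficient cooling directly shrinks the non-IT overhead for the same compute load. A minimal sketch with illustrative figures (the 5 MW IT load and the PUE values are assumptions, not figures from the article):

```python
# PUE = total facility power / IT power, so total = IT load x PUE.
# All figures below are illustrative assumptions.

def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility power implied by an IT load and a PUE."""
    return it_load_kw * pue

it_load = 5_000.0  # kW of IT load (assumed)
for pue in (1.6, 1.3, 1.1):  # rough air -> liquid cooling progression
    total = facility_power_kw(it_load, pue)
    overhead = total - it_load
    print(f"PUE {pue}: {total:,.0f} kW total ({overhead:,.0f} kW overhead)")
```

Under these assumptions, moving from a PUE of 1.6 to 1.1 cuts cooling and other overhead from 3 MW to 0.5 MW on the same IT load, freeing that power for compute.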

Hyperscale to Hyperlocal Data Centers

AI needs two types of data centers. On the one hand, it needs hyperscale facilities to support large language models and AI training deployments. Given their scale, these larger deployments will be situated in markets with lower-cost power and real estate, often on or very near existing hyperscale cloud campuses to gain further economies of scale and interconnection with cloud infrastructure. On the other hand, inferencing deployments will sit in smaller, Edge locations, because that is where end-users will query AI, and latency and proximity matter far more for inference than for training. Collectively, the incremental demand from both types of AI deployments will conservatively add 3GW of capacity demand annually to the data center market, essentially doubling the current demand forecast.
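The doubling claim can be made concrete with simple arithmetic: if roughly 3GW of incremental annual AI demand "essentially doubles" the forecast, the implied pre-AI baseline is also roughly 3GW per year (the baseline is inferred from the article's own claim, not a sourced figure):

```python
# Back-of-the-envelope: incremental AI demand vs. the implied baseline.
# The ~3 GW/year baseline is inferred from the article's doubling claim.

def total_demand_gw(baseline_gw: float, ai_incremental_gw: float) -> float:
    """Total annual data center demand including AI."""
    return baseline_gw + ai_incremental_gw

baseline = 3.0   # GW/year, inferred pre-AI demand forecast
ai_demand = 3.0  # GW/year, the article's conservative AI estimate
total = total_demand_gw(baseline, ai_demand)
print(f"{total} GW/year total, {total / baseline:.1f}x the baseline forecast")
```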

AI has been on a path toward realization for more than a century, from rudimentary fictional characters to the real-world achievements we are starting to see today. But AI can’t reach its full potential without the data and rules that will both govern and unleash its learning capabilities. Nor can it deliver on its promise without the imagination and ingenuity of the people who build and operate data centers with all the power and capacity that AI demands today and the astonishing scale it will demand in the near future.

With AI, you might say, “We’re not in Kansas anymore.” Except that, for AI modeling and other deployments that aren’t as latency-sensitive as many cloud or content deployments, Kansas may turn out to be an inviting destination for a next-generation data center. The key to the yellow brick road for data center providers will be having the capacity in the right market, with the flexibility to support the right density, available at the right time.

Phillip Marangella is Chief Marketing and Product Officer for EdgeConneX, a global data center provider focused on driving innovation. Contact EdgeConneX to learn more about their 100% customer-defined data center and infrastructure solutions.

About the Author

Voices of the Industry

Our Voice of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
