What’s Next for AI in Data Center Automation?

March 20, 2019
Our Executive Roundtable panel – Chris Sharp of Digital Realty, Tim Mirick of Sabey Data Centers and John Hewitt of Vertiv – discusses how artificial intelligence and automation are impacting data center management.

Today we continue our Data Center Executive Roundtable, a quarterly feature showcasing the insights of thought leaders on the state of the data center industry, and where it is headed. In today’s discussion, our panel of experienced data center executives – Chris Sharp of Digital Realty, Tim Mirick of Sabey Data Centers and John Hewitt of Vertiv – discusses how artificial intelligence and automation are impacting data center management.

The conversation is moderated by Rich Miller, the founder and editor of Data Center Frontier.

Data Center Frontier:  There’s a growing focus on automating data center operations, a trend driven by staffing challenges and the need for remote management of “lights out” edge data centers. Do you expect to see more automation? What are the most promising approaches in this area?

Chris Sharp, CTO, Digital Realty

Chris Sharp: Absolutely, we expect to see more automation. However, for us it’s less about achieving the “lights out” data center and more about giving our operators time to concentrate on the high-impact, high-frequency tasks that inevitably require human intervention. There are two main areas of focus that we see the industry trying to address through automation.

The first is energy efficiency. AI and machine learning algorithms enable supervised control over mechanical cooling, gradually making changes to allow for the most effective use of power. This automation has matured over the past decade to the point where the market has plenty of options to choose from. We’ve already seen Google roll out this type of solution to great effect, albeit in a more controlled, single-tenant environment. The effectiveness of this type of solution is largely determined by the availability of tunable infrastructure, but even a 5 percent gain in efficiency can have a huge impact on the profitability of a data center.

The second area is predictive maintenance. This is where we see the largest room for improvement. Most companies offering this type of solution in the data center industry are actually offering heavily supervised machine learning algorithms, often available only for products they themselves manufacture. Real benefits in this space will come when a firm obtains a critical mass of data that aligns equipment types, power usage, performance, incident and maintenance data. That is when we will see the ability to take a more hands-off approach, replacing or repairing components only when they actually need it.

As for achieving a “lights out” data center, we don’t see this as an ultimate goal for us. Not only do our customers require a level of data center management by on-site personnel, but you will never be able to predict when every component will need maintenance or fail. The key will be finding the right mix, allowing these new applications to assist humans in creating a better product. Our data center designs are engineered to meet current customer demand, and optimized for the efficiencies and logical touch points needed to meet future requirements.

John Hewitt, Vertiv

John Hewitt: Over the last 12 months we’ve seen a surge in operators deploying remote monitoring and management solutions to support lights out edge data centers. We expect this trend to continue as the number of edge deployments grows.

Remote management and monitoring has always been a smart business decision. Now, it is becoming a necessity due to both labor challenges and the complexity of managing increasingly distributed IT networks with manual processes. New cloud-connected service models are also emerging to enable fast, cost-effective services for distributed IT, further enhancing the ability of organizations to maintain lights-out facilities while ensuring high availability.

Tim Mirick, Sabey Data Centers

Tim Mirick: If we define “automation” as “virtualization of the third-party data center experience,” then absolutely, yes. Data center capacity is most convenient and cost-effective when the data center reports out through the customers’ operational dashboard using the exact metrics with which customers are most familiar.

The most promising approach: seamless compatibility of data center management and customers’ premises management systems based on a clear tenant scope of work. Customer convenience is always important—a data center is most convenient if you don’t have to go there.

NEXT: How should the data center industry assess and manage the risk from climate change?

Keep pace with the fast-moving world of data centers and cloud computing by following us on Twitter and Facebook, connecting with me on LinkedIn, and signing up for our weekly newsletter using the form below:

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
