The Future of Data Centers: An End-to-End Operating System Built on Clean Data

Oct. 2, 2023
Brian Kortendick, Director, Strategy & Growth for MCIM by Fulcrum Collaborations, explains why centralizing and standardizing operational data empowers data centers to meet future challenges and provide dependable and sustainable global infrastructure.
With the continued rapid expansion of the data center market and the proliferation of AI, we face an even greater challenge in delivering on internal and external service level agreements (SLAs) at scale.
As the industry's push to digitize everything accelerates, clean and accurate operating data will be critical to enabling sound business decisions about data center infrastructure and operations.

The need for clean data is nothing new for the industry; the data simply has not been available. Data centers are inherently complex to manage and operate. So many elements and assets are interrelated and interdependent, and every interaction, from installation to routine maintenance, creates risk. That risk is compounded by the inability to collect clean operating data, provide full-stack visibility, and feed business intelligence (BI) and generative AI tools.

The State of the Industry

The data center industry sits at the core of the modern world, housing our IT and storage assets, critical applications, and the world's data. Yet we have not advanced much beyond the archaic processes, methods, and procedures of decades past. This is largely due to outdated, disjointed, and siloed data collection systems, each with its own methods of classification and structure. The result has been manual, error-prone, time-consuming processes that produce questionable data and hinder informed decision-making at the executive level.

Reliance on antiquated data collection methods, combined with a lack of connected systems and of asset performance data, has left us relatively blind when determining our risk tolerances. The response is the most conservative approach possible to reliability: complete risk avoidance. With the industry continuing to expand at such an aggressive rate, the lack of meaningful data to back decision-making is not sustainable.

Managing assets and operations across global data center portfolios with spreadsheets, voluminous procedures, and one-off "snowflake" methods must come to an end. Furthermore, with regulatory pressures increasing globally, such as the EU's Corporate Sustainability Reporting Directive (CSRD), the transition to a centralized operations system becomes imperative.

The implementation of a central, end-to-end operating system should be a critical part of data center deployments going forward, treated with the same importance as facility layout, design, construction, fiber providers, and power distribution.

The Data Center Operating System

To meet the challenges before us, we need industry-leading management systems and a coherent software landscape across the data center. Each data center operator must have a shared foundation so that all functions can be integrated and made visible through a single source of truth. The need to digitize everything is imperative, moving us toward predictive maintenance and full-stack visibility into critical assets. A central operating system is what will truly move the industry to a predictive data center end state.

Optimizing by Digitizing

The data center operating system must be intuitive enough to drive simplicity in data collection and reporting.

It must provide a comprehensive solution for technicians, supervisors, and executives, allowing them to optimize operational processes and data collection and to support real-time decision-making in the boardroom.

Data quality is a huge challenge. According to a report in Harvard Business Review, just 3% of companies' data meets basic quality standards. Moreover, joint research by Carnegie Mellon University and IBM found that 90% of data collected by businesses is "dark data": collected but never used for any strategic purpose, insight, or decision. For data centers, we need to flip that ratio, bringing dark data down to 10% and clean data up to 90%.
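Making that 90/10 goal measurable starts with a working definition of "clean." The sketch below is a minimal, hypothetical example of scoring what fraction of a portfolio's asset records are complete enough to use for reporting and analytics; the field names and the completeness rule are illustrative assumptions, not an existing standard or MCIM's implementation.

# Minimal sketch (hypothetical field names): estimating how much of a
# portfolio's asset data is complete enough to be considered "clean".

REQUIRED_FIELDS = ["asset_id", "asset_class", "site", "install_date", "last_maintenance"]

def is_clean(record: dict) -> bool:
    """A record counts as 'clean' if every required field is present and non-empty."""
    return all(record.get(field) not in (None, "", "N/A") for field in REQUIRED_FIELDS)

def clean_data_ratio(records: list) -> float:
    """Fraction of records in the portfolio that pass the completeness check."""
    if not records:
        return 0.0
    return sum(is_clean(r) for r in records) / len(records)

# Example: two records, one missing its maintenance history.
portfolio = [
    {"asset_id": "GEN-001", "asset_class": "generator", "site": "DC-EAST",
     "install_date": "2019-04-12", "last_maintenance": "2023-08-01"},
    {"asset_id": "UPS-017", "asset_class": "ups", "site": "DC-WEST",
     "install_date": "2021-02-03", "last_maintenance": ""},
]
print(f"Clean data ratio: {clean_data_ratio(portfolio):.0%}")  # -> 50%

In practice the completeness rule would be richer (valid naming conventions, consistent units, timeliness), but even a simple score like this makes "clean data" a number a portfolio can track over time.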

The crucial factors in a high-performing data center portfolio are clean, curated, and connected data. These factors help executives gain greater insight into their operations and make truly data-driven capital planning and strategic asset management decisions, minimizing risk and ensuring dependable services and uninterrupted uptime.

Predictive Data Center

The industry must embrace the development of an industry-wide asset performance benchmarking system.

It is critical to have global visibility into asset reliability performance as we move toward the future of the predictive data center. The data should center not only on reliability but also on sustainability as we move to reporting embodied carbon and energy-use footprints. This need will only grow with the explosion of AI and its true impact on the data center. Today, we are still in the crystal-ball stage of predicting AI's global impact on our industry. We need to step up to the challenge before us.
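As an illustration of what standardized, benchmarkable asset data might look like, the sketch below defines a hypothetical record that combines reliability and sustainability fields. The field names, units, and metrics are assumptions chosen for illustration, not an existing industry schema or benchmarking standard.

# Illustrative sketch only: one possible shape for a standardized asset
# performance record that could feed an industry-wide benchmarking system.
from dataclasses import dataclass

@dataclass
class AssetPerformanceRecord:
    asset_class: str           # standardized class name, e.g. "diesel_generator"
    manufacturer: str
    model: str
    operating_hours: float     # cumulative runtime over the reporting period
    failure_count: int         # recorded functional failures
    embodied_carbon_kg: float  # embodied carbon, for sustainability reporting
    energy_use_kwh: float      # measured energy consumption over the period

    def mtbf_hours(self) -> float:
        """Mean time between failures; infinite if no failures are recorded."""
        return self.operating_hours / self.failure_count if self.failure_count else float("inf")

# With records standardized this way, reliability (MTBF) and sustainability
# (embodied carbon, energy use) can be compared across sites and operators.
record = AssetPerformanceRecord(
    asset_class="diesel_generator", manufacturer="Acme", model="G-2000",
    operating_hours=12_000.0, failure_count=2,
    embodied_carbon_kg=15_000.0, energy_use_kwh=48_000.0,
)
print(f"MTBF: {record.mtbf_hours():.0f} hours")

The specific fields matter less than the consistency: a shared asset classification and shared units are what make benchmarks across operators meaningful.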

The Bottom Line

Studies routinely show that companies with higher levels of digital maturity outperform their peers. Across revenue, net profits, return on invested capital (ROIC), and growth, digitally mature companies lead their industries thanks to improved efficiency, quality, and the ability to adapt quickly to market changes.

As the data center industry evolves, the need for clean, curated, and connected data has never been more critical. By focusing on a centralized system that standardizes operational data, the industry can meet the challenge ahead, providing reliable, efficient, and sustainable infrastructure for the digital world. Data center executives who embrace an end-to-end data center operating system, implement predictive analytics, and adopt industry standards for data collection and asset naming conventions, ensuring clean classification and accurate benchmarking data, will excel in decision-making compared to those relying on questionable data. Standardization is the key to extracting critical insights from real-world operating data, supporting system reliability, operating efficiency, and carbon reduction, and enabling the business decisions that ensure a sustainable future.

Brian Kortendick is Director, Strategy & Growth, MCIM by Fulcrum Collaborations. Contact MCIM to learn more about their data center solutions. 

