In this week’s Voice of the Industry, Daniel Draper of Emerson Network Power explains the latest figures on the cost of data center downtime as quantified by the Ponemon Institute.
Time is money. And data center downtime is A LOT of money. That’s what the Ponemon Institute’s findings indicate in the most recent edition of its Cost of Data Center Outages report.
The average cost of a data center outage in 2016 now stands at $740,357, up 38% from when the report was first developed in 2010. That works out to $8,851 per minute in lost revenue and unproductive employees (“e-mail’s down, time for some Minesweeper!”).
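As a quick sanity check on those two figures (my own back-of-envelope arithmetic, not a number from the report), dividing the average outage cost by the per-minute cost gives the implied length of an average outage:

```python
# Figures quoted from the Ponemon report above
avg_outage_cost = 740_357   # average cost of one unplanned outage, USD
cost_per_minute = 8_851     # average cost per minute of downtime, USD

# Implied average outage duration -- my own arithmetic, not a report figure
implied_minutes = avg_outage_cost / cost_per_minute
print(f"{implied_minutes:.0f} minutes")  # → 84 minutes
```

In other words, the averages are consistent with an outage lasting a little under an hour and a half.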
So how did the Ponemon Institute arrive at an average cost of $740,357 per unplanned outage? To get that figure, it audited 63 data centers in North America that had experienced an outage. Using an activity-based costing model, the researchers captured information about both direct and indirect costs, including:
- Damage to mission-critical data
- Impact of downtime on organizational productivity
- Damages to equipment and other assets
- Cost to detect and remediate systems and core business processes
- Legal and regulatory impact, including litigation defense cost
- Lost confidence and trust among key stakeholders
- Diminishment of marketplace brand and reputation
Now back to the cost of downtime. Way back in 2010, the average cost of an outage was calculated at $505,502. So what explains the quarter-million-dollar increase in costs? Well, think back to 2010 and how much internet-based technology we used (or didn’t use, as it turns out). In 2010, I had a Facebook account, as did 500 million others around the world; now Facebook has 1.5 billion profiles. 2010 was the year the first iPad came out. Cyber Monday accounted for less than a billion dollars in sales; today, over $2 billion of commerce happens online on just that one day. The ranks of cable cord-cutters are growing, and streaming media is quickly becoming mainstream in households all across the country.
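For the record, the jump between the two reports is simple subtraction on the two published averages (my own arithmetic on the figures quoted above):

```python
cost_2010 = 505_502  # average outage cost in the first (2010) report, USD
cost_2016 = 740_357  # average outage cost in the 2016 report, USD

# The "quarter of a million dollar" increase mentioned above
increase = cost_2016 - cost_2010
print(f"${increase:,}")  # → $234,855
```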