Hyperscale Needs an Edge, Too

Oct. 17, 2019
Cloud, social, SaaS and mobile companies will benefit from strategic, proximity-based edge computing infrastructure. Phillip Marangella of EdgeConneX explores the factors guiding hyperscale demand at the edge.

Phillip Marangella, Chief Marketing Officer at EdgeConneX, continues his exploration of edge computing and asks this question: What about hyperscalers? According to Marangella, hyperscale needs an edge, too.

Phillip Marangella, Chief Marketing Officer, EdgeConneX

Conventional wisdom holds that edge data centers are relatively small facilities, typically deployed in Tier II and Tier III markets, often in support of content providers, network providers and local hosting providers. Yet because the purpose of edge infrastructure is to deliver successful business outcomes, the edge is really wherever the customer needs it to be, regardless of size, workload or location. Ultimately, the edge is the lowest-latency demarcation point between service delivery and consumption, and it will change in size and location relative to the service being delivered and the devices being used.

But what about hyperscalers? Wouldn’t large cloud, social, SaaS and mobile companies also benefit from proximity-based edge infrastructure strategically located near the end user’s point of access? According to Forrester, “Applications work best when they’re close to the point of interest. Because latency and bandwidth demands push tech to the edges, the applications performing the business functions must go with it.”

Yes, hyperscale needs an edge too, both literally and figuratively.

By localizing data acquisition and control functions, as well as the storage of high bandwidth content and applications in close proximity to the end user, edge computing facilities circumvent the distance, capacity constraints, multiple network hops, and centralized processing loads that exist in traditional internet architecture. This ensures the lowest latency data delivery with improved quality of service and increased security, while reducing backbone transport costs.
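
To make the latency argument concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes light in optical fiber covers roughly 1 km per 5 microseconds (about two-thirds the speed of light) and deliberately ignores routing detours, queuing and processing delays, which add more in practice; the distances are illustrative, not measurements from any particular deployment.

```python
# Back-of-the-envelope propagation latency: centralized vs. edge placement.
# Assumes ~5 microseconds per km one way in optical fiber (light at ~2/3 c);
# ignores routing detours, queuing, and processing delays.

US_PER_KM = 5  # one-way propagation delay in fiber, microseconds per km

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds for a given path length."""
    return 2 * distance_km * US_PER_KM / 1000

# A user reaching a distant centralized data center vs. a metro-local edge site.
for label, km in [("centralized (2,000 km away)", 2000), ("edge (50 km away)", 50)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
# centralized (2,000 km away): ~20.0 ms round trip
# edge (50 km away): ~0.5 ms round trip
```

Even before queuing and peering delays are counted, moving the service from a distant hub into the metro cuts the propagation floor by more than an order of magnitude.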

There’s no question that today’s tech giants will continue to build greater levels of compute capacity in the form of hyperscale data centers. Moreover, market demand for their products and services will make it incumbent upon them to build these facilities with increasing speed to market. Cisco estimates that by 2021, hyperscale data centers will account for 55% of all data center traffic, 65% of data stored in data centers, and 69% of all data center processing power. According to MarketsandMarkets, this growth will only continue, with the hyperscale data center market expected to reach nearly $81 billion by 2022.

In this heightened, high-stakes business environment, hyperscalers also need to bring their storage and compute infrastructure closer to end users to lower network costs and improve the performance of their products in the regions where they are deployed. Providers that deploy their services as close to their end users as possible can gain an edge on their competitors through the improved performance and quality of experience that lower latency delivers.

But don’t just listen to me; listen to voices from some of the world’s leading service providers on their views of the edge:

 “As nearly everyone and everything gets connected, the data that is required to function in the digital world risks being congested in the core or, even worse, caught up in large-scale cyberattacks. As a result, the world is now realizing just how important the real estate at the edge can be.”

– Tom Leighton, CEO, Akamai

“…we don’t think of hybrid as a stopgap, as a move to the Cloud. We think about it as the coming together of distributed computing, where the Cloud and edge computing work together, not just for old workloads, but most importantly for new workloads.”

– Satya Nadella, CEO, Microsoft

“…there are 3 broad reasons local data processing is important, in addition to cloud-based processing:

1. Laws of Physics: applications make interactive and critical decisions locally because it takes time to send data to the cloud, and networks don’t have 100% availability.
2. Laws of Economics: local aggregation and filtering of data allows customers to send only high-value data to the cloud for storage and analysis.
3. Law of the Land: some governments impose data sovereignty restrictions on where data may be stored and processed.”

– Werner Vogels, CTO, AWS
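
Vogels’ “Laws of Economics” point is, in practice, an aggregate-and-filter pattern. The following sketch is purely hypothetical and not AWS code: an edge node keeps cheap local aggregates and forwards only out-of-range readings upstream; the function name, thresholds and payload shape are all invented for illustration.

```python
# Hypothetical edge-side filter: aggregate raw readings locally and forward
# only high-value (anomalous) data to the cloud for storage and analysis.
from statistics import mean

def filter_for_cloud(readings: list[float], low: float, high: float) -> dict:
    """Summarize locally; upload only readings outside the expected band."""
    anomalies = [r for r in readings if not (low <= r <= high)]
    return {
        "count": len(readings),           # cheap local aggregate
        "mean": round(mean(readings), 2), # kept at the edge, sent as a summary
        "anomalies": anomalies,           # the only raw values sent upstream
    }

payload = filter_for_cloud([21.0, 21.4, 35.9, 20.8, 21.1], low=15.0, high=25.0)
print(payload)  # {'count': 5, 'mean': 24.04, 'anomalies': [35.9]}
```

Five raw readings collapse into one small payload; multiplied across a sensor fleet, that difference is the backbone-transport saving the quote describes.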

“A single (autonomous) test vehicle can generate petabytes of data annually. Capturing, managing and processing this massive amount of data requires an entirely new computing architecture and infrastructure.”

– Nvidia, Self-Driving Safety Report 2018
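
The scale of that figure becomes easier to grasp with some quick arithmetic. The numbers below assume, purely for illustration, 2 PB per vehicle per year; the report itself gives only the order of magnitude.

```python
# Rough scale of "petabytes of data annually" (illustrative: 2 PB per year).
PB = 1e15  # bytes
data_per_year = 2 * PB
per_day_tb = data_per_year / 365 / 1e12
sustained_gbps = data_per_year * 8 / (365 * 24 * 3600) / 1e9
print(f"~{per_day_tb:.1f} TB/day, ~{sustained_gbps:.2f} Gbit/s sustained")
# ~5.5 TB/day, ~0.51 Gbit/s sustained
```

Hauling a sustained half a gigabit per second, per vehicle, back to a distant core is exactly the kind of load that argues for ingesting and processing data at a nearby edge facility.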

Since 2010, EdgeConneX has been building out the edge based on our customers’ unique requirements, and how they define the edge can range from 10 kW deployments to facilities of more than 100 MW. These network, content and cloud providers will continue to drive the edge’s expansion, while proofs of concept for autonomous vehicles, smart cities and other IoT use cases will add new flavors to the edge, today and in the future.

The key is to be flexible, fast and scalable in order to support the dynamic requirements of these edge use cases across the full continuum of data center needs. The goal is to bring services closer to the customer, rather than the customer to the service, providing a localized, more proximate solution for service providers and customers globally.

Phillip Marangella is Chief Marketing Officer at EdgeConneX

About the Author

Voices of the Industry

Our Voice of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
