As Gaming Moves to the Cloud, Data Centers Focus on Latency

Nov. 22, 2019
This week’s launch of Google Stadia highlights how cloud gaming may emerge as a significant opportunity for the data center industry, particularly providers offering interconnection and edge computing services that address latency.

Gamers hate lag – also known as network latency. As more major gaming services shift to a streaming model, network performance will be critical to keeping customers happy.

This could emerge as a significant opportunity for the data center industry, particularly providers offering interconnection and edge computing services.

Streaming gaming is in the spotlight with this week’s launch of Google Stadia, which has been highly anticipated because of the enormous network and computing resources that Google can apply to delivering superior performance. The cloud delivery model allows Stadia games to be played on your TV without a console, or streamed to a Chrome browser on a laptop or mobile phone.

“Using our globally connected network of Google data centers, Stadia will free players from the limitations of traditional consoles and PCs,” said Phil Harrison, Vice President and GM of Google Stadia.

Early reviews from gaming enthusiasts are mixed, but Google’s entry into streaming gaming will likely be followed by other large tech players seeking to get a piece of this market.

One thing that is immediately clear is that streaming gaming will place significant demands on network infrastructure. Google says Stadia can use between 4GB and 20GB of data per hour, with early testing suggesting that in 1080p high-definition the service tracks closer to the 20GB projection.

“So far, relatively little information has been transmitted via online gaming,” said Dr. Thomas King, Chief Technology Officer at Internet exchange DE-CIX. “The really large program files are stored on the user’s own hard drive. Cloud gaming is changing this. Since all computing operations are performed on one server, there will no longer be the need for local installation of the games. Then the home computer, smartphone, or other devices will only serve as an output medium. If you look at the increasingly realistic depictions of current games, you can readily imagine the immense amounts of data that will have to be dealt with.”

That could test bandwidth caps on many home broadband plans.
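
To put that in perspective, here is a quick back-of-the-envelope sketch, assuming Stadia's upper 20GB-per-hour estimate and a hypothetical 1 TB monthly cap of the kind common on home broadband plans:

```python
# Rough sketch: how many hours of cloud gaming fit under a monthly data cap.
# The 20 GB/hour rate is the upper end of Google's Stadia estimate cited
# above; the 1 TB monthly cap is a hypothetical broadband plan limit.

GB_PER_HOUR = 20          # upper end of Stadia's stated usage
MONTHLY_CAP_GB = 1000     # assumed 1 TB monthly data cap

hours_of_play = MONTHLY_CAP_GB / GB_PER_HOUR
print(f"Hours of streaming before hitting the cap: {hours_of_play:.0f}")
# -> 50 hours, before counting any other household traffic
```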

“You’re going to need a ton more data,” said Bryan Hill, Director of Marketing & Business Development at Interxion. “Data for gaming is not like video data. Streaming benefits from data compression, but compression is hard and doing it well takes time.

“When Netflix does compression, they do it in advance,” Hill added. “Yet when it comes to the two-way live action that gaming requires – like when that alien is coming and you need to shoot it now – even a second delay is not okay. So, data will be at a premium. Expect new ISPs and data providers to roll out higher tiers of data plans for gamers.”

As data and network requirements increase, not everyone has the infrastructure that Google can put behind an application.

“Large gaming companies with access to capital may be able to build their own infrastructure, but this can be complex and costly to maintain,” noted Eduardo Carvalho, Managing Director for Equinix in Brazil, in a recent blog post.

“Renting infrastructure from content delivery networks (CDNs) can help, but that won’t address the entire interactive gaming experience,” he continued. “While CDNs tend to have a broad global reach, they are optimized to deliver static and streaming content one-way. That can work well for downloading game content but many games also require fast connectivity back to the gaming platform for player actions.”

The Geography of Online Gaming

There’s no question that latency demands will guide the geography of gaming infrastructure, requiring attention to the location of network hubs as well as end users.

“Latency is really important,” said Daniel Golding, the Global Network Planning and Design Lead for Google. “It’s not necessarily latency to the end user, but low latency to the interconnection. We want that latency low and predictable. Proximity obviously becomes an issue.”
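
Golding’s emphasis on latency that is both low and predictable points at two different measurements: the typical round-trip time and how much it varies. A minimal sketch of how that might be characterized, using a hypothetical set of RTT samples:

```python
# Sketch: gauging whether latency to an interconnection point is "low and
# predictable". rtt_samples_ms is a hypothetical set of round-trip
# measurements (in milliseconds) collected with any probing tool.
import statistics

rtt_samples_ms = [12.1, 11.8, 12.4, 13.0, 11.9, 12.2, 30.5, 12.0]

median_rtt = statistics.median(rtt_samples_ms)   # "low"
worst_rtt = max(rtt_samples_ms)                  # tail spikes
jitter = statistics.pstdev(rtt_samples_ms)       # "predictable"

print(f"median={median_rtt:.1f} ms  worst={worst_rtt:.1f} ms  jitter={jitter:.1f} ms")
# A low median paired with a high worst case or high jitter means the path is
# fast on average but not predictable enough for streamed gameplay.
```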

Google has perhaps the world’s most powerful data center network, operating 20 huge cloud campuses around the globe, as well as thousands of edge nodes to manage latency and content distribution.

For many gaming companies, housing infrastructure in major hyperscale markets is central to their strategy. Cologix said it has seen strong uptake from gaming companies in its data centers in the Montreal area, which offers strong fiber connectivity and low power costs.

“Big gaming companies see value in being in that (hyperscale) footprint for their dense connectivity requirements,” said Bill Fathers, CEO of Cologix, which builds connected data centers in regional markets. “They’re growing really quickly. Once they’re in your platform, they grow and are ferocious consumers of bandwidth. They prefer to be quite centralized, rather than distributed.”

Data on gamer and developer perspectives on performance. (Source: INAP)

Industry watchers predict this will shift over time. Data center provider INAP, which has a long history in network optimization, says game studios don’t always fully appreciate the importance of latency for their users, which suggests that they currently underprovision network resources – a pattern that will be problematic as more games shift to a streaming model.

INAP surveyed more than 200 gamers and game developers attending GDC 2019 and found a disconnect between the two groups about the barriers to great online gaming. Gamers cited high lag/latency as the top reason for quitting a game, while developers pointed to bad game mechanics as the bigger problem.

An Opportunity for Edge Growth?

While interconnection is a key performance driver, gaming also is a growing business for edge computing, which shifts content closer to end users.

“A low latency time depends on many factors, one of which is definitely spatial proximity,” says King of DE-CIX. “For data transmission, the speed of light is a natural speed limit. At some point, an ever-faster transmission will no longer be feasible. Therefore, other methods will be required: servers and nodes must be closer to the user. It will be crucial for cloud gaming providers to get their data and computing capacities as close as possible to their customers.”
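
King’s speed-of-light point can be made concrete: signals in optical fiber propagate at roughly 200,000 kilometers per second, about two-thirds the speed of light in a vacuum, so distance alone sets a floor under round-trip time before any routing or queuing delay is added. A rough sketch of that lower bound:

```python
# Back-of-the-envelope lower bound on round-trip time imposed by distance.
# Assumes signals travel at ~200,000 km/s in optical fiber (about two-thirds
# the speed of light in a vacuum) and ignores routing, queuing and
# processing delays entirely.

FIBER_KM_PER_MS = 200.0   # ~200,000 km/s expressed as km per millisecond

def min_rtt_ms(distance_km: float) -> float:
    """A round trip covers the distance twice, out and back."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (100, 1000, 5000):
    print(f"{km:>5} km away -> at least {min_rtt_ms(km):.1f} ms RTT")
# 100 km -> 1.0 ms, 1000 km -> 10.0 ms, 5000 km -> 50.0 ms (propagation only)
```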

“We’re seeing a lot of applications have to be close to gamers for performance,” said Phillip Marangella, Chief Marketing Officer at EdgeConneX, which operates a network of 30 edge data centers, primarily in regional markets.

The shift to streaming could boost demand for distributed infrastructure, according to some edge specialists.

“Streaming gaming is an end user application I’m seeing deployed today (at the edge),” said Matt Trifiro, Chief Marketing Officer of Vapor IO, who pointed to Fortnite, which helped Epic Games earn $2.4 billion in 2018. “It’s a giant opportunity. It is a major trend that will drive a lot of edge computing.”

New data from EdgeGap illustrates the potential performance benefits of edge computing for gamers. In a test with Ubisoft using its Far Cry 5 game, EdgeGap found that optimizing routing to nearby edge data centers improved average round-trip times for gaming data by 58 percent, from 116 milliseconds to 49 milliseconds.

Data from EdgeGap illustrating the latency improvements in a network test with Ubisoft. (Image: EdgeGap)

EdgeGap is an edge provider that helps gaming studios deploy multiplayer server instances closer to end users. The company argues that latency is more closely related to distance than to any network-based factors: placing relays physically closer to gamers means fewer hops, fewer routers and faster reaction times. On the infrastructure front, EdgeGap is working with Packet, StackPath, Vapor IO and Deutsche Telekom’s MobiledgeX.
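
EdgeGap has not published its matchmaking logic here, but the underlying idea of steering a player to the closest relay can be sketched as choosing the candidate site with the lowest measured round-trip time. The site names and RTT figures below are purely illustrative:

```python
# Illustrative sketch of latency-based relay selection: probe each candidate
# edge site and route the player to the one with the lowest round-trip time.
# Site names and RTT values are hypothetical; a production system would
# measure them live and also weigh capacity and cost.

candidate_sites_rtt_ms = {
    "edge-montreal": 18.4,
    "edge-toronto": 9.7,
    "edge-chicago": 24.1,
}

best_site = min(candidate_sites_rtt_ms, key=candidate_sites_rtt_ms.get)
print(f"Route player to {best_site} "
      f"({candidate_sites_rtt_ms[best_site]:.1f} ms RTT)")
# -> Route player to edge-toronto (9.7 ms RTT)
```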

One challenge for service providers is that powerful gaming hardware can test rack densities. Marangella says EdgeConneX sees rack densities of 35kW and beyond from some of its gaming clients.

While not speaking specifically about Stadia, Google’s Golding noted that gaming infrastructure often relies on GPUs, which require higher rack density.

“They run hot, so there are some specialized requirements that are sometimes outside the cold white box (of a traditional data center),” said Golding. “GPUs are still evolving very, very quickly. We may see some very different new requirements for data centers.”

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
