Facebook Plans Colossal 11-Story Data Center in Singapore

Sept. 6, 2018
Facebook is expanding its data center network to Asia, and doing it in dramatic fashion, with a colossal 11-story, 1.8 million square foot facility that will be one of the largest data center structures ever built.

The new $1 billion project was disclosed Thursday morning by Facebook engineering executive Jay Parikh, and reinforces several trends we’ve been tracking here at Data Center Frontier, including the super-sizing of hyperscale data center requirements and the trend toward multi-story facilities to add new capacity in constricted real estate markets. It also continues a massive expansion of Facebook’s network as it seeks to keep pace with the growing use of video, artificial intelligence and other data-intensive technologies.

Facebook’s design represents a startling new approach, deploying data center capacity vertically. It’s the tallest facility yet to use a true hyperscale design, in which larger facilities capture economies of scale.

“Singapore is one of the most vibrant and modern technology hubs in Asia,” Parikh wrote. “However, it presents a new set of efficiency challenges due to its high temperatures and humidity. To address these and other unique operational requirements, including building in a dense, urban environment, we came up with a new design and way to build this facility.”

Singapore: Home of the High-Rise Data Center

The key challenge in Singapore is land. The large parcels Facebook usually seeks for its multi-building cloud campuses are largely unavailable in Singapore, an island nation where land is extremely expensive. The government of Singapore has been encouraging data center operators to build taller facilities. That’s why other data center companies in Singapore have employed multi-story designs. Google operates a five-story data center in Singapore, which is its tallest data center anywhere in the world. Digital Realty also operates several multi-story data centers in Singapore, while SingTel has a seven-story facility.

But none approach the 11-story design disclosed today by Facebook. Images of the building show cooling infrastructure housed on the roof, an approach supported by the new cooling system. The main data center appears to have an attached support building that is about five stories tall, which could house backup generators.

There have been taller buildings that house data centers, including major carrier hotels like 60 Hudson Street in New York and One Wilshire in Los Angeles, which serve the concentration of business customers in those cities’ central business districts. These carrier hotels were originally office buildings, however, and have limited floor plates and restrictions on mechanical and electrical equipment.

Among purpose-built data centers, Google began building taller data centers in 2016, enabling it to pack more servers into the same real estate footprint, providing more bang for its buck on each of its huge cloud campuses. Colocation provider Equinix recently opened a dedicated eight-story data center facility in Amsterdam, one of the most active data center markets in Europe. There is also a strong trend toward multi-story data centers in two of the leading U.S. Internet hubs, Northern Virginia and Silicon Valley.

Facebook has traditionally built data centers that house servers on a single story, with a second-floor “penthouse” dedicated to cooling systems. The Singapore design adapts that large-footprint hyperscale design and adds multiple stories. This approach often creates challenges in routing power and fiber cabling, which is typically addressed with dedicated vertical risers.

New Cooling Design for Warm Weather

A key component of Facebook’s design is a new cooling system that the company unveiled in June, which allows it to cool its servers and storage equipment in warmer climates. The new system, known as the StatePoint Liquid Cooling (SPLC) system, was developed in partnership with cooling specialist Nortek Inc., and uses an approach to evaporative cooling that is new to the data center industry.

As DCF noted at the time, this type of system could be useful in Asia and other humid climates, where data centers often operate slightly less efficiently due to the use of more power-intensive cooling and dehumidification systems.

In most of its data centers, Facebook uses direct cooling, bringing filtered outside air into the data hall and circulating it through racks to remove the heat generated by servers and storage units. SPLC takes a different approach, using the outside air temperature to produce cool water, which can then be used in cooling systems. Facebook is currently using the water in a cooling coil, which cools air that flows through the racks to cool servers.

SPLC is an evaporative cooling system that uses a liquid-to-air energy exchanger, in which water is cooled as it evaporates through a membrane separation layer. This cold water is then used to cool the air inside the data center.
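
The physics behind any evaporative cooler also explains why water consumption is such a concern at this scale: heat rejection is tied to the latent heat of vaporization of water. Here is a minimal sketch of that energy balance, using textbook numbers rather than figures from Facebook’s SPLC design:

```python
# Illustrative energy balance for evaporative cooling; these are
# textbook numbers, not figures from Facebook's SPLC design.

LATENT_HEAT_J_PER_KG = 2.45e6  # approx. latent heat of vaporization of water

def water_evaporated_kg_per_hour(heat_load_kw: float) -> float:
    """Water that must evaporate each hour to reject a given heat load."""
    joules_per_hour = heat_load_kw * 1000 * 3600  # kW -> J/h
    return joules_per_hour / LATENT_HEAT_J_PER_KG

# Rejecting 1 MW of server heat purely by evaporation consumes roughly
# 1,470 kg (about 1.5 cubic meters) of water per hour, which is why
# trimming evaporative losses matters at hyperscale.
print(f"{water_evaporated_kg_per_hour(1000):,.0f} kg/h")
```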

The new system will use far less water than other indirect cooling technologies, addressing growing concerns about the use of water by hyperscale data centers, and their potential impact upon the local water resources and the environment.

“This technology minimizes water and power consumption and can maintain required temperatures without supplemental cooling,” Parikh wrote. “It can reduce the amount of water used by 20% in hot and humid climates like Singapore when compared to other indirect cooling systems. With an expected PUE of 1.19, we expect this facility to be one of the most efficient data centers in the region.”
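
For context on that last claim: PUE (Power Usage Effectiveness) is the ratio of total facility power to IT equipment power, so a PUE of 1.19 implies roughly 19 watts of cooling and power-distribution overhead for every 100 watts delivered to servers. A minimal illustration, with assumed loads:

```python
# PUE = total facility power / IT equipment power; 1.0 would mean
# zero overhead. The loads below are assumptions for illustration.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness for a given facility and IT load."""
    return total_facility_kw / it_load_kw

# At Parikh's cited PUE of 1.19, every 100 kW of servers implies
# about 19 kW of cooling, conversion and lighting overhead.
print(pue(total_facility_kw=119.0, it_load_kw=100.0))  # -> 1.19
```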

Now Building Bigger, and Bigger…

The Singapore project also continues a trend in which Facebook is super-sizing the scale of Internet infrastructure, building bigger data centers and larger cloud campuses. We first reported on this trend in February 2017, and the company’s construction program has accelerated relentlessly ever since.

What’s driving this growth? Facebook’s growing focus on video shifts the math on file storage and data center requirements, as HD video files are substantially larger than photos. Facebook has been scaling up its infrastructure to handle massive growth in user photo uploads, including custom cold storage facilities and the use of Blu-ray discs to save energy on long-term storage. Video storage can be an even larger and more expensive challenge. Google, which operates YouTube as well as a cloud platform, spends more than $10 billion a year on data center infrastructure.
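
A rough back-of-the-envelope comparison shows the scale of that shift; the photo size and video bitrate below are assumptions for illustration, not Facebook data:

```python
# Back-of-the-envelope storage comparison; the photo size and video
# bitrate are assumptions for illustration, not Facebook data.

PHOTO_MB = 3.0        # assumed size of a typical compressed photo
HD_VIDEO_MBPS = 5.0   # assumed bitrate of 1080p video, in megabits/s

def hd_video_mb(seconds: float) -> float:
    """Approximate file size of an HD clip at the assumed bitrate."""
    return HD_VIDEO_MBPS * seconds / 8  # megabits -> megabytes

one_minute = hd_video_mb(60)    # ~37.5 MB, roughly a dozen photos
one_hour = hd_video_mb(3600)    # ~2,250 MB, roughly 750 photos
print(one_minute, one_hour, one_hour / PHOTO_MB)
```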

It also was inevitable that Facebook would require data center infrastructure in Asia, which has been one of the fastest-growing regions for Internet use, and thus a hotbed of data center activity.

“We’re very excited about expanding our data center footprint into Asia, and it becoming part of our highly advanced infrastructure that helps bring Facebook apps and services to you every day,” said Parikh.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
