DCF Tours: Sabey Data Centers' SDC Ashburn, Data Center Alley, Virginia

March 12, 2024
Located in the heart of Loudoun County, SDC Ashburn encompasses a 38-acre campus in the world’s densest connectivity corridor, commonly referred to as Data Center Alley.

I'm greeted on a crisp, bright day early last November by Sabey Data Centers' Michael Whitlock, General Manager of the SDC Ashburn data center campus, at the site's main customer entrance. Located at 21741 Red Rum Drive in Ashburn, Virginia, SDC Ashburn encompasses a 38-acre campus in Loudoun County, home to the world's densest connectivity corridor, commonly referred to as Data Center Alley.

Security and the SOC

As my visitor's badge is slipped to me by security personnel through a metal tray under the bulletproof glass at the lobby desk, Whitlock emphasizes the importance of physical security in data centers - how it is the first point of contact for customers. "There's six times that you have to show that you are who you say you are to get in," he says, adding, "All of us are the same as it relates to security and customers' assets."

For security, SDC Ashburn employs multi-tier access control measures and procedures including: 24/7 onsite security personnel; perimeter security including setbacks, berms, and fencing; secure access checkpoints at every door; CCTV cameras positioned throughout the campus; mantraps located at each building entrance; and biometric locks on outside doors. 

Whitlock notes that security is a top priority in data centers for multiple reasons. "You'll notice that in the way this building is laid out," he says. "There's berms to keep people from coming in, the K-rated fencing that surrounds the perimeter of the facility. Things like that, we do to make sure the customer is at ease when they have their product here."

He adds, "Security is also the first person that you talk to in a data center. They're the first person you're going to meet and introduce you to our company. So that has to be done with the utmost graciousness to make you feel warm and welcome, and guide you to where you need to go."

Two of three buildings on the SDC Ashburn campus are now completed and leasing colocation and powered shell data center space, offering tenants over 36 MW of power and access to multiple connectivity options. With single or multi-tenant buildings and private suites and data halls available, the operator says that each building on its Ashburn data center campus "is built on the foundations of sustainability, security, and durability," based on a modular design that makes it easy for customers to scale with flexibility and customization as their business grows. 

Sabey Data Centers' commitment to modular data center design is underscored by the multi-tenant data center specialist's announcement this month of a partnership with ECLAIRION, a provider of modular, containerized data center solutions recognized as France's first data center provider dedicated to high-density colocation hosting within container modules. The new mutual referral agreement allows both companies to offer solutions to potential customers on a global basis.

Paris-based ECLAIRION specializes in ecologically sound, modular data centers for hosting high-density IT equipment. The company's approach is tailored to the needs of intensive computing sites, including those requiring high-performance computing (HPC) and artificial intelligence (AI) capabilities.

Meanwhile, an essential part of the ECLAIRION value proposition is the company's commitment to improving environmental impacts, "aiming to achieve carbon neutrality through the use of decarbonized energy supplies and innovative cooling technologies that significantly reduce energy consumption," as stated in a press release.

"We are excited to partner with ECLAIRION," commented Rob Rockwood, President at Sabey Data Centers. "As our clients increasingly demand the additional capacity and power required from AI-based solutions, our alliance with ECLAIRION will enable us to flexibly and rapidly meet their needs. In addition, ECLAIRION's approach to sustainability aligns with our corporate goals," Rockwood added. "This partnership will benefit our companies, our clients, and the environment as well." Rockwood noted that Sabey has seven data centers that have achieved Energy Star compliance, as part of the company's focus on achieving net-zero carbon emissions by 2029.

Passing beyond the turnstile system through the bypass door used for facility tours, I notice on hallway signage that SDC Ashburn's compliance roster includes adherence to standards for: HIPAA and HITECH; PCI DSS; SSAE 18 SOC 1 Type 2 and SOC 2 Type 2; and ISO 27001 certification, among others.

We stop inside SDC Ashburn's network operations center. "This is our Sabey Operations Center," emphasizes Whitlock. "Other companies may call it a NOC; we just put a little twist on it, to say SOC. Here's where we monitor our environmentals."

He continues, "We have a SOC on the East Coast and the West Coast [in Ashburn and Quincy, WA]. We're staffed 24/7 on both sides of the U.S, and we also have engineering that's here [in Ashburn] 24/7 as well now. Like a help desk, this is the number that you call if you have things going on in your space you're uncertain about. The engineers have the same rights to see the same thing Ms. Diane [Sabey's associate monitoring the SOC] is seeing. It's just a second layer of protection to ensure that nothing slips through the cracks. Because in data centers, every second is important."

 

MPOE Interconnection and Carrier Row

Whitlock ushers me down the hallway to the data center's main point of entry (MPOE) and carrier equipment room, where outside plant fiber cables from ISPs enter through an open floor conduit and snake their way up and over cable trays hanging from the ceiling. "This is what we call the carrier row," says SDC Ashburn's GM. "They put their infrastructure right here: that's where the internet meets the customers."

Featuring dual campus fiber entries and redundant pathways to on-site, carrier-class Meet-Me-Rooms, SDC Ashburn is a carrier-neutral facility that facilitates interconnection between multiple telecommunication carriers for both local metro and long haul networks. Currently providing connections to more than five Tier 1 providers, SDC Ashburn supports the networks of Cogent, Crown Castle, FiberLight, PacketFabric (SDN), and Zayo (lit and dark fiber), with additional carriers regularly building in.

As we stand inside the MPOE and carrier equipment chamber, Whitlock explains, "We manage those relationships to get those carriers in here. Their framework goes on these three rows of open cabinets; then on the backside is our customers, just in this building only." He clarifies, "What happens is: if Customer B wants to do business with Carrier A, they make a phone call to us, do the proper paperwork and say, I want you to connect me." He continues, "In running the building, all we do then is take a fiber cable to go from there to their stuff."

Whitlock adds, "It's all based on designated ports. We receive a LOA, letter of authorization, so we don't touch customers' equipment for this process unless the customer gives us permission to do so. That's what they call 'remote hands' - when we're doing something for a customer because they can't tend to it, or don't want to tend to it, they lean on us for our expertise to make sure that it is carried through."

We talk about the campus' dark fiber redundancy, spanning to other points in Ashburn, and Whitlock reviews the differences between dark and lit fiber, as well as how customers tap into carrier hotels for connectivity: "A lot of these customers, they want to get to the mothership, [meaning] carrier hotels such as CoreSite and Equinix, to get out to the rest of the world. For redundancy, a lot of customers will tap in here, they'll tap into data center B, they'll tap into data center C, and then they'll get out to one of the larger carrier hotels. That's how they make their business, by having a smorgasbord of carriers that customers can choose from."

Whitlock goes on, "But for data centers such as ours, we complement our customers by understanding what carriers they're looking for - all in all, it's a business. If I were to get carrier B to come in here and charge them monthly rent and they're not getting any revenue from it, it would be a total waste of time for them. So we wait and plan and have conversations with the potential customers on what providers they're looking for, and then my job is to go out and make those connections, and to get those providers in here to accommodate our customers."

Walking me through the 23,000 sq. ft. expanse of a 3-power data hall leased to multiple customers, past fan walls that extend all the way down the hall's length, Whitlock explains that the SDC Ashburn campus is sold out. "The campus is completely full. Building C and Building B are entirely sold out."

He clarifies, "We sold out during the pandemic and would be up and rolling with the next building to come on our campus, a 54 MW facility, but that's being put on hold because of the Dominion situation. But if that didn't happen, we'd probably be halfway sold out of that right now."

This month, Vultr, one of the world's largest privately held cloud computing platforms, announced the expansion of its Seattle cloud data center region at Sabey Data Centers' SDC Columbia location. Vultr's expansion includes a significant new inventory of NVIDIA HGX H100 GPU clusters, available both on demand and through reserved instance contracts.

Having recently completed the third of up to nine possible buildings across 130+ acres, Sabey’s SDC Columbia location utilizes cost-efficient, sustainable hydropower. With a 100 Energy Star rating for three consecutive years and a market-leading annualized power usage effectiveness (PUE) of 1.15, Sabey asserts that its SDC Columbia is the most efficient data center in the region, enabling Vultr to efficiently scale cloud computing and cloud GPU capacity with renewable power. 
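
For context on what that PUE figure implies, here is the simple arithmetic behind power usage effectiveness, sketched in Python. The 1 MW IT load is an assumed round number for illustration; only the 1.15 ratio comes from the article above.

    # PUE = total facility energy / IT equipment energy. An annualized PUE of
    # 1.15 means roughly 0.15 kW of cooling, power-conversion, and lighting
    # overhead for every 1 kW delivered to IT equipment.
    def pue(total_facility_kw: float, it_load_kw: float) -> float:
        return total_facility_kw / it_load_kw

    it_load_kw = 1000.0     # hypothetical 1 MW of IT load
    overhead_kw = 150.0     # hypothetical overhead consistent with PUE 1.15
    print(pue(it_load_kw + overhead_kw, it_load_kw))   # -> 1.15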

The expansion offers Vultr customers the opportunity to harness the power of HGX H100 clusters for their AI workloads, while also choosing the cleanest, most energy-efficient cloud GPU option, enabling them to meet their ESG goals. 

"We chose Sabey for our latest expansion of NVIDIA GPU capacity because of our shared goal of reducing the carbon footprint of AI training and inference,” said J.J. Kardwell, CEO of Vultr’s parent company, Constant. “This clean, renewable, hydro-powered data center enables Vultr to deliver Cloud GPUs optimized to meet the needs of customers with the highest standards for compliance, sustainability, and price-to-performance.”

“We are thrilled to welcome Vultr to our customer roster as the first cloud computing platform to offer cloud GPUs in our SDC Columbia data center,” said Sabey Data Centers' President Rob Rockwood. “Given the crucial role data centers and AI play in our lives and in shaping humanity’s future, it’s important that we power them as sustainably as possible. By joining forces with Vultr, together we bring much-needed resources to innovation teams in a way that benefits business, society and the planet.”

 

Energy Efficient Cooling Systems

Sustainability hallmarks of SDC Ashburn's Energy Star Certified data center facilities include all-LED lighting and a low PUE design. The company says the campus' energy efficient cooling systems utilize "meaningful innovations" to decrease overall power usage, while ensuring the optimal environment for customers' computing, network, and storage equipment. 

Such systems include: evaporative and economizer cooling; hot aisle/cold aisle containment systems; rooftop air handling units; and a chilled water system with CRAH [computer room air handling] units for Building B. Operationally, the campus facilities include 24/7 critical environment monitoring with mission-critical responsiveness to changes in the data center environment. Fire detection systems are installed above and below racks.

We proceed through SDC Ashburn's expansive CRAH gallery. Whitlock remarks, "Some folks would give you a lot of jargon, but the way I look at this is, when I tell you about a CRAH unit, the air handler is usually in conjunction with water. For a CRAC [computer room air conditioner] unit, the computer room AC is usually associated with direct refrigerant with a direct compressor on it." 

He continues, "Here my form of refrigerant is water and if you look above your head, you'll see twelve-inch lines that are full of cold water. One side may come in at 48 degrees [Farenheit] and leave at 56 degrees or somewhere in that vicinity. The amount of heat transfer that goes into these air handlers is usually determined by the amount of load that we have inside the data hall."

Directing my attention to more infrastructure, Whitlock goes on, "These air conditioner units are what provides cooling to the data halls. This is no different than what you have in your home, it's just a lot bigger, and it gets its main source of water from an air cooled chiller that sits up on our roof [which] just provides cold water. There's a supply and a return and just like at your house, we have filters in here, and as you can see, they're sparkling white, nice and clean."

Later, on the floor of a data hall, beyond a cage of server cabinets where technicians, amid considerable heat and noise, are load-testing banks of plugs, Whitlock explains the principles of hot air containment and heat transfer: "The backside is where it exhausts its heat; this door will be closed and it will contain all the hot air. As all this is closed up and sealed nice and tight, the hot air exhaust from the back goes up through the ceiling, hits those fans and those air conditioning units that I showed you initially."

Walking me past (and in some cases through) the infrastructure under review, he continues, "That's where the delta becomes the ten to twelve degrees in the water; as I told you, it comes in at 40-something and leaves at the high 50s, mid 60s, because of the heat transfer." We're now standing directly in front of the fan wall. Whitlock continues, "All that hot air goes back there, mixes with the warmer air, and is then separated and pushed out as the cold air you're feeling right now."
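
To put rough numbers on the heat transfer Whitlock describes, here is a back-of-the-envelope sketch in Python using the standard water-side rule of thumb Q ≈ 500 × GPM × ΔT. The flow rate is my assumption for illustration; only the delta-T range comes from the tour.

    # Rough estimate of heat rejected by a chilled water loop.
    # For water: Q [BTU/hr] ~= 500 * flow [GPM] * delta_T [deg F].
    def chilled_water_heat_btu_hr(flow_gpm: float, delta_t_f: float) -> float:
        return 500.0 * flow_gpm * delta_t_f

    BTU_HR_PER_KW = 3412.14

    flow_gpm = 1000.0   # hypothetical loop flow rate
    delta_t_f = 10.0    # supply/return split in the range Whitlock cites
    q_kw = chilled_water_heat_btu_hr(flow_gpm, delta_t_f) / BTU_HR_PER_KW
    print(f"~{q_kw:.0f} kW of IT heat rejected")   # ~1465 kW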

 

Power and Electrical Systems

SDC Ashburn's redundant electrical systems are geared to ultimately provide a highly resilient environment for 70+ megawatts (MW) of aggregate power, delivered to the campus with up to 2.85 MW of critical power per module. Campus Building B is supported by 2.25 MW capacity diesel generators with 48-hour run time at peak load. Building C is supported by 2 MW capacity generators with 72-hour run time at peak load. 

A 300 megawatt (MW) Dominion Energy substation located immediately onsite helps to support clients’ current and future power requirements.

As we stand next to it, Whitlock describes how the 300 MW substation feeds the campus buildings underground through pad-mounted switches, metering cabinets, and finally a step-down transformer. He explains, "There's two different types of transformers, step-up and step-down, depending on what you're looking to do. That transformer steps the power down coming from the substation to 480 volts, the voltage that we use throughout the facility."
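
As a sense of scale for why distribution at 480 volts involves such heavy conductors and busway, here is a quick three-phase current calculation in Python for one 2.85 MW module. The unity power factor is my simplifying assumption, not a stated facility spec.

    # Approximate line current for a three-phase load: I = P / (sqrt(3) * V * pf).
    import math

    def three_phase_current_amps(power_w: float, line_voltage_v: float,
                                 power_factor: float = 1.0) -> float:
        return power_w / (math.sqrt(3) * line_voltage_v * power_factor)

    # One 2.85 MW module at 480 V draws on the order of 3,400 amps.
    print(f"{three_phase_current_amps(2_850_000, 480):.0f} A")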

I then reference DCF Editor at Large Rich Miller's 2023 article analyzing the data center power situation in Virginia, which describes how Dominion Energy plans to work its way out of its dilemma - ample power generation, but not enough transmission lines to deliver it - through upgrades and additions currently underway and slated for completion by 2026.

Whitlock affirms, "People always say, 'Dominion has no more power.' Dominion has enough power. They don't have enough transmission lines to deliver it to the designated spot. These transmission lines need to be doubled and some of them are being upgraded from a 250 kV rating to a 500 kV rating." 

Other key features of Sabey Data Centers' power footprint in Ashburn include electrical backup systems in N+1 configuration; redundant HVAC systems designed for immediate failover; and an on-site fuel supply for emergency events (with additional supply through emergency fuel contracts). 

Whitlock explains how multiple generators are used in a distributed, redundant topology for the power system: "We have about 20 generators onsite, ten on each side. We have five different lineups to make up four when it's all said and done. That extra lineup is there just to add comfort. When one of the other lineups feels like it doesn't want to play ball anymore, it steps in."
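
A minimal sketch of that "five lineups to make four" logic, in Python. The per-lineup capacity is a placeholder for illustration, not a figure from Sabey's design documents.

    # Distributed-redundant (N+1) generator topology: the campus needs four
    # lineups' worth of capacity, and a fifth stands by to absorb a failure.
    LINEUP_CAPACITY_MW = 9.0     # hypothetical capacity per lineup
    REQUIRED_LINEUPS = 4         # "four when it's all said and done"

    def can_carry_load(healthy_lineups: int, load_mw: float) -> bool:
        return healthy_lineups * LINEUP_CAPACITY_MW >= load_mw

    load_mw = REQUIRED_LINEUPS * LINEUP_CAPACITY_MW   # design load = 4 lineups
    print(can_carry_load(5, load_mw))   # True  - all lineups available
    print(can_carry_load(4, load_mw))   # True  - one lineup lost, N+1 absorbs it
    print(can_carry_load(3, load_mw))   # False - beyond the design redundancy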

 

Data Center Industry PR Discussion

During the tour, Whitlock and I discuss the Prince William Digital Gateway data center development and ongoing local controversy over that project and others like it.

"The data center market has improved their game on relationship-building with residents," he opines. "I totally feel that Sabey [is] making provisions based off what residents are saying and we're trying to bridge that gap."

Whitlock goes on, "Part of my job as the GM is community relationships, to be involved with the politics of the counties that we do business in and to do relationship-building and reach out to communities. We do a lot of things. We're part of a PTO program with an elementary school that we've sponsored where we do fun runs. We are part of the robotics class. And we don't just do it here, we do that all across our portfolio. That's what we do: We are good stewards as it relates to building communities and establishing who we are as a company in all the communities that we do business in." 

He adds, "But if the data centers don't get out there and speak that language and let people know about the opportunity that's in store for them when [the industry] comes to their counties, then it won't work."

Nearing the tour's end, Whitlock notes Sabey Data Centers' involvement in outreach programs to increase diversity in the industry. "I'm the president of the Potomac Chapter for AFCOM," he states. "We have relationships and Sabey has relationships with Northern Virginia Community College, who has created a DCO program for data center operations. We've hired two kids out of that program up to now."

Whitlock reveals that he's also on the NVCC advisory committee, involved in planning DEI outreach programs. "We have one young lady who works in our operations team, and we have a couple ladies who work for the construction side of Sabey," he says. "So we definitely do what we need to do to make sure that everybody's getting a fair shake, and to ensure that everybody's getting the knowledge and the opportunities that they should be given."

Escorting me outside the building at the end of our tour, Whitlock reminds me that the data centers in Loudoun County benefit residents with lower taxes and improved infrastructure. He notes that the business tax paid by data centers in Loudoun allows for new fire stations, schools, and libraries, and credits $1,500 back per family.

He concludes, "This is the most expensive county in the United States, and this is the anomaly: The price of my house is going up, but my taxes are going down. Where else does that happen?"

 


About the Author

Matt Vincent

A B2B technology journalist and editor with more than two decades of experience, Matt Vincent is Editor in Chief of Data Center Frontier.

