Free Cooling & Beyond – Data Center Efficiency at the Edge

Aug. 17, 2016
In this week’s Voices of the Industry Earl Keisling, co-founder and CEO of Inertech LLC, the data center infrastructure technology division of Aligned Energy, discusses Data Center Efficiency at the Edge.

Data center consumption of energy and water is rising. At the same time, cheap energy and plentiful water are becoming harder to find. It’s no longer just a sustainability issue, either: investors now weigh resource consumption as a risk factor for the companies they invest in.

Data centers’ resource consumption is rising not because data centers are getting less efficient (the recent report by the Center of Expertise for Energy Efficiency in Data Centers shows that data centers continue to improve their resource efficiency). Data center demand – driven by demand for Internet-enabled everything – is simply rising faster than efficiency gains.

Historically, it has been hyper-scale companies like Facebook, Microsoft, Google, and others that have led the way in data center efficiency gains. At hyper-scale, those gains have often come from locating data centers in climates that allow for free cooling. Free cooling, the use of outside air rather than a chiller plant, can dramatically reduce the energy needed to cool the data center.

Growing demand for data centers at the edge

But the geography of data center demand is changing. There is rising demand for computational power “at the edge” among cloud service and content providers like Google, Netflix, Facebook, Amazon and others, who are looking to deliver content to customers around the world with low latency. Without an edge data center, customers have to fetch content from the nearest city where it’s cached. The problem is that distance increases latency, and low latency is becoming more important.

Gartner analyst Bob Gill called for a move to the edge last year in a report titled The Edge Manifesto. “The edge manifesto calls for the placement of content, compute and data center resources on the edge of the network, closer to concentrations of users,” Gill wrote. “This augmentation of the traditional centralized data center model ensures a better user experience demanded by digital business.”

But here’s the rub: in many cases, the drive to deliver content with low latency conflicts with efforts to raise data center efficiency. Traditional air-side free cooling is effective in some edge locations, but in many locations – where it gets too hot and/or too humid – free cooling is not effective, and water consumption surges.

Does that mean organizations have to choose between resource efficiency and latency? No. But having both requires a new approach to data center cooling.

A new approach to data center cooling

Our new approach was born out of the understanding that the data center cooling problem is actually a heat removal problem. So instead of blowing cold air into the data center, our technology (for which we won a 2016 Edison Award) removes the heat.

Rethinking the approach to data center cooling, for us, resulted in a close-coupled system that absorbs heat at its source, allowing hot air to rise and cooler, denser air to settle where it is needed. A heat sink draws the hot air from the servers, passes it across coils and “neutralizes” the heat without chilling it – sending 75-77°F air to the server inlets at the front of the enclosure.

We eliminated the chiller plant and the expansion valves of a traditional system and we significantly reduced the use of compression. A thermal hub exchanges the absorbed heat to the closed circuit loop and the heat is transported for rejection to the atmosphere. In contrast to traditional chillers, this system relies on free cooling most of the time, even in hot climates. When temperatures can’t support 100% free cooling, the system makes use of indirect evaporative cooling. The cycle consistently and effectively manages water and compression power consumption.
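The mode selection described above can be pictured as a simple decision sequence: use free cooling whenever ambient conditions allow, fall back to indirect evaporative cooling when they don't, and use limited compression only as a last resort. The sketch below is a hypothetical illustration of that logic; the threshold values and function names are assumptions for clarity, not Inertech specifications.

```python
# Hypothetical sketch of the cooling-mode selection described above.
# Threshold values are illustrative assumptions, not Inertech specifications.

FREE_COOLING_MAX_F = 68.0   # assumed outdoor dry-bulb limit for 100% free cooling
TARGET_SUPPLY_F = 76.0      # mid-point of the 75-77°F server inlet range

def select_cooling_mode(outdoor_drybulb_f: float, outdoor_wetbulb_f: float) -> str:
    """Pick the most water- and energy-frugal mode that can hold the supply target."""
    if outdoor_drybulb_f <= FREE_COOLING_MAX_F:
        return "free_cooling"            # reject heat directly to ambient: no water, no compressors
    if outdoor_wetbulb_f <= TARGET_SUPPLY_F - 7.0:
        return "indirect_evaporative"    # evaporation on the outside loop; air paths stay separate
    return "trim_compression"            # limited mechanical cooling only when all else falls short

print(select_cooling_mode(60.0, 55.0))  # free_cooling
print(select_cooling_mode(90.0, 65.0))  # indirect_evaporative
```

The point of the ordering is that each step down the ladder costs more water or compressor energy than the one above it, which is why the system spends most of its hours in the first branch even in warm climates.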

Aisle containment systems and cooling infrastructure inside the Aligned Data Centers R&D facility in Connecticut. (Image: Aligned Data Centers)

Balancing the efficiencies of free cooling with the realities of the climate in edge locations enables data centers to save up to 85% of their water consumption and up to 80% of their energy consumption. Even in hot and humid locations, companies can achieve annualized mechanical PUEs of 1.03-1.08 and WUEs of 0.4-0.6 – comparable to what they’re achieving in locations ideal for free cooling. So cloud service and content providers get the best of both worlds – reaching their customers around the world and keeping energy and water consumption low.
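To make the quoted metrics concrete: a mechanical PUE of 1.03 means the cooling plant adds only 3% on top of the IT load, and a WUE of 0.4 means 0.4 liters of water per kWh of IT energy. The back-of-the-envelope sketch below assumes a hypothetical 1 MW IT load purely for illustration.

```python
# Back-of-the-envelope illustration of the PUE/WUE figures quoted above.
# The 1 MW IT load is an assumption for illustration, not a figure from the article.

IT_LOAD_KW = 1000.0          # hypothetical 1 MW IT load
HOURS_PER_YEAR = 8760

def cooling_overhead_kw(mechanical_pue: float, it_load_kw: float = IT_LOAD_KW) -> float:
    """Mechanical PUE = (IT + cooling) / IT, so cooling overhead = IT * (PUE - 1)."""
    return it_load_kw * (mechanical_pue - 1.0)

def annual_water_liters(wue_l_per_kwh: float, it_load_kw: float = IT_LOAD_KW) -> float:
    """WUE is liters of water per kWh of IT energy consumed."""
    return wue_l_per_kwh * it_load_kw * HOURS_PER_YEAR

# At the low end of the quoted ranges (mechanical PUE 1.03, WUE 0.4):
print(round(cooling_overhead_kw(1.03), 1))       # 30.0 -> 30 kW of cooling per 1,000 kW of IT
print(round(annual_water_liters(0.4) / 1e6, 2))  # 3.5  -> ~3.5 million liters per year
```

For comparison, the same 1 MW load behind a conventional plant running at a mechanical PUE of 1.5 would draw roughly 500 kW for cooling, which is where the "up to 80%" energy savings figure comes from.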

The same design elements that enable efficiency in any climate deliver other benefits important for edge deployments as well. The system enables just-in-time, hyper-scale deployments with significantly less construction risk. It supports dynamic densities from 1 kW to 25 kW+ per rack. And its simplicity drives 99.9999% reliability and makes the system easy for ground crews in edge locations to service.

The Bottom Line

The vast majority of consumers are “at the edge.” So the continued success of cloud service and content providers depends on their ability to reach their customers with low latency, no matter where those customers are. And that means those hyper-scale companies need their hyper-scale data centers in Tier 1 cities, as well as smaller-scale data centers in edge locations around the world.

Now, with innovations in cooling technology and a model that replaces large, costly infrastructure with scalable, right-sized systems repeatable in all edge markets, the companies that have worked hard to drive resource efficiency in their hyper-scale data centers can achieve the same level of resource efficiency in their edge deployments, no matter the climate.

Earl Keisling is a co-founder and CEO of Inertech LLC, the data center infrastructure technology division of Aligned Energy. He has been awarded multiple patents and is a master mechanical engineer with more than 30 years of experience designing and building large, complex construction and infrastructure projects.

About the Author

Voices of the Industry

Our Voice of the Industry feature showcases guest articles on thought leadership from sponsors of Data Center Frontier. For more information, see our Voices of the Industry description and guidelines.
