Project Brainwave: Intel FPGAs Accelerate Microsoft’s AI Hardware

Aug. 23, 2017
This week Microsoft unveiled Project Brainwave, a deep learning acceleration platform that is the latest fruit of its collaboration with Intel on FPGA-based computing.

The artificial intelligence arms race continues, as the largest tech companies explore new ways to accelerate AI workloads for their cloud platforms. The appetite for more computing horsepower is following several tracks, with major investment in graphics processing units (GPUs) as well as custom ASICs.

Microsoft has been a leader in using FPGAs (Field Programmable Gate Arrays) to accelerate its cloud and AI workloads. This week Microsoft unveiled Project Brainwave, a deep learning acceleration platform based on its collaboration with Intel on FPGA computing.

Microsoft says Project Brainwave represents a “major leap forward” in cloud-based deep learning performance, and intends to bring the technology to its Azure cloud computing platform.

“We designed the system for real-time AI, which means the system processes requests as fast as it receives them, with ultra-low latency,” writes Doug Burger, a Microsoft Distinguished Engineer, in a blog post. “Real-time AI is becoming increasingly important as cloud infrastructures process live data streams, whether they be search queries, videos, sensor streams, or interactions with users.”

Real-Time Deep Learning

Microsoft’s announcement Tuesday at the Hot Chips 2017 conference fleshed out the details of an approach the company described in broad terms at its Build developer conference in April. Microsoft says the new approach, which it calls Hardware Microservices, will allow deep neural networks (DNNs) to run in the cloud with no software in the loop, resulting in large gains in speed and efficiency.

FPGAs are semiconductors that can be reprogrammed to perform specialized computing tasks, allowing users to tailor compute power to specific workloads or applications. FPGAs can serve as coprocessors to accelerate CPU workloads, an approach long used in high-performance computing (HPC). Intel gained its FPGA technology through its $16.7 billion acquisition of Altera, which closed in late 2015.
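To make the coprocessor pattern concrete, here is a minimal sketch in Python, purely for illustration: the host CPU keeps orchestration duties while a compute-heavy DNN kernel runs on an accelerator. The FpgaCoprocessor class is a hypothetical stand-in, not an Intel or Microsoft API.

```python
# Illustrative sketch only: FpgaCoprocessor is a hypothetical stand-in for an
# FPGA offload interface; it is not an Intel, Altera, or Microsoft API.
import numpy as np

class FpgaCoprocessor:
    """Pretend accelerator: a real one would run a bitstream synthesized for
    this kernel; here the math runs on the CPU just to keep the sketch runnable."""

    def __init__(self, weights: np.ndarray):
        # On real hardware the weights would be loaded into on-chip memory.
        self.weights = weights

    def dense_layer(self, activations: np.ndarray) -> np.ndarray:
        # The offloaded kernel: one fully connected DNN layer (matmul + ReLU).
        return np.maximum(activations @ self.weights, 0.0)

def serve_request(accel: FpgaCoprocessor, request: np.ndarray) -> np.ndarray:
    # The CPU handles orchestration; the heavy math belongs to the accelerator.
    return accel.dense_layer(request)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    accel = FpgaCoprocessor(weights=rng.standard_normal((256, 128)))
    print(serve_request(accel, rng.standard_normal((1, 256))).shape)  # (1, 128)
```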

“We exploit the flexibility of Intel FPGAs to incorporate new innovations rapidly, while offering performance comparable to, or greater than, many ASIC-based deep learning processing units,” said Burger.

Microsoft is using Intel Stratix 10 FPGAs as the hardware accelerator in its Brainwave platform. Microsoft describes its approach as using a “soft” DNN processing unit (or DPU), synthesized onto commercially available FPGAs. Microsoft says this approach provides flexibility and the ability to rapidly implement changes as AI technology advances.

Microsoft’s Project Brainwave hardware, which leverages Intel Stratix 10 FPGAs. (Photo: Microsoft)

“By attaching high-performance FPGAs directly to our datacenter network, we can serve DNNs as hardware microservices, where a DNN can be mapped to a pool of remote FPGAs and called by a server with no software in the loop,” Burger explained. “This system architecture both reduces latency, since the CPU does not need to process incoming requests, and allows very high throughput, with the FPGA processing requests as fast as the network can stream them.”
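A rough sketch of the contrast Burger draws, with the conventional CPU-in-the-loop path alongside the hardware-microservice path. The endpoint address, wire format, and the toy “model” below are hypothetical placeholders, not details of Microsoft’s system.

```python
# Conceptual contrast only: the endpoint address, wire format, and the tiny
# "model" below are hypothetical placeholders, not Microsoft's actual design.
import json
import socket

FPGA_POOL_ENDPOINT = ("10.0.0.42", 9000)  # hypothetical network-attached FPGA pool

def run_model_in_software(features):
    # Stand-in for a DNN evaluated on the CPU (the conventional path).
    return [max(x, 0.0) for x in features]

def serve_with_cpu_in_loop(request_bytes: bytes) -> bytes:
    # Conventional serving: the CPU deserializes the request and runs the
    # model in software, so software latency sits on every request.
    features = json.loads(request_bytes)
    return json.dumps(run_model_in_software(features)).encode()

def serve_as_hardware_microservice(request_bytes: bytes) -> bytes:
    # Brainwave-style serving (simplified): the host forwards raw request
    # bytes straight to the network-attached FPGA pool; no model code runs
    # on the CPU, which is what "no software in the loop" refers to.
    with socket.create_connection(FPGA_POOL_ENDPOINT, timeout=1.0) as conn:
        conn.sendall(request_bytes)
        return conn.recv(65536)
```

In the second path the host is little more than a network forwarder, which is why latency drops and throughput is bounded mainly by the network rather than by server software.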

Project Brainwave, leveraging the Intel Stratix 10 technology, demonstrated over 39 teraflops of achieved performance on a single request, according to Microsoft and Intel. Brainwave is currently being used in Microsoft’s Bing search engine, but the company hopes to deploy it on its Azure cloud service.
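For a sense of scale, a back-of-the-envelope calculation (the per-inference operation count below is a hypothetical example, not the model Microsoft benchmarked): at roughly 39 teraflops sustained, even a model requiring around 10 billion operations per forward pass would need only about a quarter of a millisecond of raw compute.

```python
# Back-of-the-envelope only: the per-inference operation count is a
# hypothetical example, not the model Microsoft demonstrated at Hot Chips.
SUSTAINED_TFLOPS = 39.0      # "over 39 teraflops" reported on a single request
OPS_PER_INFERENCE = 10e9     # assumed ops for one forward pass (hypothetical)

compute_time_s = OPS_PER_INFERENCE / (SUSTAINED_TFLOPS * 1e12)
print(f"Compute time per request: {compute_time_s * 1e3:.3f} ms")
# ~0.256 ms of raw compute, before network and queuing overheads.
```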

“In the near future, we’ll detail when our Azure customers will be able to run their most complex deep learning models at record-setting performance,” said Burger. “With the Project Brainwave system incorporated at scale and available to our customers, Microsoft Azure will have industry-leading capabilities for real-time AI.”

Chinese Tech Firms Adopt AMD EPYC Servers

AMD also had news at the Hot Chips event, announcing that Chinese tech titans Tencent and JD.com plan to deploy servers based on its EPYC processors in their cloud and e-commerce operations. The wins are a signal of progress for AMD, which recently re-entered the data center market in earnest.

Tencent Cloud said that it plans to introduce AMD EPYC-based 2P cloud servers with up to 64 processor cores before the end of 2017. JD.com also committed to future adoption of EPYC servers, but did not set a timeline.

“To continue as a leading provider of high-performance and high-value cloud services, Tencent needs to adopt the most advanced infrastructure and the chip industry’s latest achievements,” said Sage Zou, senior director of Tencent Cloud. “Tencent Cloud is continuously seeking more cores, more I/O interfaces, more secure hardware features and improved total cost of ownership for server hardware products.”

“By partnering with these market leaders, AMD is bringing choice and competition to one of the fastest growing technology markets in the world,” said Forrest Norrod, senior vice president and general manager, Enterprise, Embedded and Semi-Custom products, AMD.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
