The Raspberry Pi Tackles HPC With 750-Unit Cluster

Nov. 14, 2017
Researchers at Los Alamos National Laboratory have combined 750 Raspberry Pi systems to create a low-budget system to test HPC software.

You may not think of the Raspberry Pi as a candidate for high performance computing. But if you pack enough of them into a chassis, you can create an inexpensive, low-energy system to test software for deployment on petascale systems.

That’s what researchers at Los Alamos National Laboratory have done, working with HPC vendors BitScope and SICORP to build a cluster of 750 Raspberry Pi systems. They say the system, which is being demonstrated at this week’s SC17 conference, can save enormous amounts of money for researchers testing HPC applications.

“It’s not like you can keep a petascale machine around for R&D work in scalable systems software,” said Gary Grider, leader of the High Performance Computing Division at Los Alamos National Laboratory, home of the Trinity supercomputer. “The Raspberry Pi modules let developers figure out how to write this software and get it to work reliably without having a dedicated testbed of the same size, which would cost a quarter billion dollars and use 25 megawatts of electricity.”

The Raspberry Pi is a credit-card sized computer that can be connected to a keyboard and TV to do just about anything a typical desktop computer can do. It was developed by the Raspberry Pi Foundation in the United Kingdom to help spread computing in education and in developing countries. A basic model starts at about $25 and consumes only a handful of watts of power.

Seeking a cost-effective solution to the challenges facing HPC systems software developers, Grider said, he “suddenly realized the Raspberry Pi was an inexpensive computer using 2 to 3 watts that you could use to build a several-thousand-node system large enough to provide a low-cost, low-power testbed to enable this R&D.” But he was unable to locate a suitable densely packaged Raspberry Pi system on the market.

“It was just people building clusters with Tinker Toys and Legos,” said Grider, who turned to SICORP of Albuquerque, N.M., to collaborate on a solution. Then they jointly worked with BitScope of Australia to develop easily scaled rack-mounted units.

The BitScope system consists of five rack-mounted Pi Cluster Modules, each holding 150 four-core Raspberry Pi ARM boards, fully integrated with network switching infrastructure. With a total of 750 CPUs, or 3,000 cores, working together, the system gives developers exclusive time on an inexpensive but highly parallelized platform for testing and validating scalable systems software technologies.
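The cluster arithmetic above, along with Grider's power figures, can be sketched as a quick back-of-the-envelope calculation (the per-node wattage is the upper end of his 2-to-3-watt estimate; the 25 MW figure is his estimate for a dedicated petascale testbed):

```python
# Back-of-the-envelope figures for the BitScope Pi Cluster Modules.
MODULES = 5
NODES_PER_MODULE = 150
CORES_PER_NODE = 4    # quad-core ARM SoC on each Raspberry Pi board
WATTS_PER_NODE = 3    # upper end of Grider's 2-3 W per-Pi estimate

nodes = MODULES * NODES_PER_MODULE
cores = nodes * CORES_PER_NODE
cluster_watts = nodes * WATTS_PER_NODE

print(f"{nodes} nodes, {cores} cores, ~{cluster_watts / 1000:.2f} kW")
# 750 nodes, 3000 cores, ~2.25 kW -- versus the ~25 MW Grider cites
# for a dedicated petascale testbed.
```

Even at the high end of the per-node estimate, the whole cluster draws on the order of a few kilowatts, roughly four orders of magnitude less than the petascale machine it stands in for.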

“Having worked with Raspberry Pi for quite some time, I’ve long thought it the ideal candidate to build low-cost cloud and cluster computing solutions for research and education,” said Bruce Tulloch, CEO of BitScope. “When SICORP approached us with Gary’s plans, we jumped at the opportunity to prove the concept.”

Check out the SICORP website for additional details.

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
