New Chips, Software Shift Workloads From Cloud to Mobile Devices

July 10, 2017
Advances in hardware and software are bringing the computing power of the cloud into devices in our pockets, allowing smartphones and IoT devices to run AI neural networks and other data-intensive processing.

Our things are getting smarter and more powerful, bringing the computing power of the cloud into devices in our pockets. The trend is enabled by advances in hardware and software, as startups and cloud platforms alike seek to capitalize on the disruptive changes in the technology landscape.

The power of these new chips and devices will help shape America’s evolving IT infrastructure, moving more workloads and tasks to the very edge of the network. The Internet of Things (IoT) and artificial intelligence (AI) are bringing intelligence to mobile devices and industrial equipment, shifting computing power and algorithms to devices like smartphones and tablets, as well as appliances on factory floors and in hospitals.

“You can now put more computational ability at the very edge of the network, essentially making it as if the cloud is in your back pocket,” said Ed Chan, Senior VP for Technology Strategy at Verizon, at a recent conference. “That’s kind of how we envision the way that 5G (next-generation connectivity) is going to change the world.”

The evolution of edge devices and “fog computing” – processing power near the perimeter of the network – will play a role in the geography of the data center industry, helping to deliver capacity to billions of devices and sensors.

This trend is expected to play a leading role in the Internet of Things, but is also emerging as a key strategy in artificial intelligence, providing the ability to run neural networks on smartphones. The capabilities of these devices will ripple beyond the fog layer, impacting the path of data traffic and location of workloads.

Want a Neural Network With that Smartphone?

New technologies like the Internet of Things, artificial intelligence (AI), autonomous vehicles and virtual reality will require data storage and computing power to become highly distributed.

A key element of all these technologies is analytics – the ability to process Big Data and extract value for businesses and consumers. This crunching of big data has historically been performed in the data center. As mobile devices add processing power and algorithms become more efficient, some analytics jobs are shifting to devices on the edge.

“Everything becomes a data center, because it has to,” said Scott Noteboom, founder and CEO of LitBit. “The majority of data center calculations and analytics will take place on the devices themselves.”

This has led to some impressive new hardware capabilities for mobile devices. At the recent O’Reilly Artificial Intelligence conference, startup Aipoly showed off a smartphone app that can run a convolutional neural network, a data-intensive AI process widely used for image recognition.

“Rather than running this in the cloud or large servers, we’re starting to run it on the mobile device,” said Aipoly co-founder Albert Rizzoli. “We wanted a system that could run in real time.”

Aipoly co-founder Albert Rizzoli demonstrates how the company’s app enables visually-impaired persons to use their smartphone camera for object recognition. Rizzoli presented at the O’Reilly AI conference in New York. (Photo: Rich Miller)

Aipoly allows the visually-impaired to use their smartphone as a sensor and guide. Users point their phone, and the Aipoly app recognizes objects and people and provides an audio description. It can recognize friends, identify products on a grocery shelf, and discern colors and shapes.

The Aipoly app requires real-time execution to be effective. Sending images to the cloud and back would take about two seconds, while Aipoly’s on-device technology can deliver results in 250 milliseconds – roughly an eighth of the time. Aipoly has developed a deep learning engine that runs AI algorithms more efficiently.
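Aipoly’s engine is proprietary, so the sketch below is only an illustration of the general on-device inference pattern the company describes: load a compact, pre-trained model onto the phone and run each camera frame through it locally, with no network hop. It uses TensorFlow Lite as a stand-in, and the model file and frame handling are hypothetical placeholders, not Aipoly’s code.

    # Illustrative only: on-device image classification with a small,
    # pre-trained model. The model file is a hypothetical placeholder.
    import numpy as np
    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="mobile_classifier.tflite")
    interpreter.allocate_tensors()
    input_info = interpreter.get_input_details()[0]
    output_info = interpreter.get_output_details()[0]

    def classify(frame):
        """Run one camera frame (already resized to the model's input shape)
        through the network locally and return the top label index."""
        tensor = np.expand_dims(frame, axis=0).astype(input_info["dtype"])
        interpreter.set_tensor(input_info["index"], tensor)
        interpreter.invoke()                      # inference happens on-device
        scores = interpreter.get_tensor(output_info["index"])[0]
        return int(np.argmax(scores))

On a phone the same idea typically runs through a mobile runtime or a custom engine like Aipoly’s, but the structure – load the model once, run it per frame, never leave the device – is what makes the 250-millisecond response possible.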

“Cameras are no longer just for photography,” said Rizzoli. “They’re becoming sensors. AI on low power devices is commoditized. You can essentially have a Pokedex in your hand serving as a digital assistant.”

What New Technology Means to the Data Center

At first glance, the improved processing power of mobile devices could be perceived as a threat to data centers, representing a paradigm shift that frees users from the need to process data in huge server farms. The advancements in device-based processing and “fog computing” – data-crunching that occurs on near-edge appliances and servers – bear close watching, given their potential to create a more distributed computing architecture.

Over the years, pundits have warned that new processing technology could hurt the data center business – most notably Wall Street analyst Jim Cramer, who in 2009 warned his viewers to sell Equinix because new Intel chips “would make the data center obsolete.” Stock in Equinix, which was $77 at the time, has since risen to $422 a share. That may explain why Cramer has experienced a conversion, and now regularly features data center stocks (especially Equinix).

In truth, this is a trend that has been in place for years, in a variety of forms. Chips and memory have been getting smaller and more powerful for decades, and virtualization and containerization have emerged to make more efficient use of resources. Yet Moore’s Law has not slowed the growth of the data center industry, even through major shifts to mobile and cloud platforms – an illustration of the Jevons Paradox, in which greater efficiency prompts more usage, not less.

Edge or Fog?

Leading hardware vendors and cloud service providers are deploying products to help customers run some AI analytics on devices or on-premises appliances. In many cases, these capabilities provide an initial layer of analysis, reducing the amount of data that must be sent across the network for further analytics.

“The cloud was going to solve all our problems, but now there’s too much data to send,” said Sastry Malladi, Chief Technology Officer of Foghorn Systems, which makes software for small-footprint devices. “How do you run software to ingest and process data on devices? We used to call that embedded computing.”

The language around distributed computing can be a jumble of terms that evolve, intersect and overlap. Here at Data Center Frontier, we use the term edge computing when writing about network distribution. As we’ve noted, “edge” can be defined in a number of ways, spanning everything from regional data centers to micro data centers at telecom towers to the endpoints themselves.

Fog computing is a popular term for those focused on the Internet of Things. The IoT will produce an enormous number of connected devices – between 20 billion and 50 billion, according to various estimates. Some of these devices, like autonomous cars and some factory equipment, will generate huge volumes of data that requires real-time action.

The term fog computing has been embraced by Cisco Systems to describe a layer of computing between endpoints and data centers. Cisco has teamed with Intel, Microsoft, Dell, ARM and Princeton University to form the OpenFog Consortium, which seeks to accelerate distributed computing through open architecture and reference frameworks.

In some instances, fog computing distributes decision-making, helping customers operate in real time with minimal latency.

“The idea of doing everything in the cloud will not work,” said Philippe Fremont, VP of Technical Marketing at Avnet, which makes embedded components for IoT applications. “A lot of our suppliers are moving intelligence to the edge.”

Getting Mythic With Processing in Memory

The shift to a distributed network is driven by innovation in both hardware and software. An example is Mythic, an Austin-based startup making hardware to power AI on devices.

“Neural networks are incredibly powerful, and you’re putting them inside devices with resource constraints,” said Michael Henry, the CEO of Mythic, in a presentation at the O’Reilly AI conference. “When we do these intelligent tasks, we want to do them at the edge. But you can’t put a GPU in a Fitbit.”

How do you put more power in a device without generating heat and draining the battery? Mythic’s solution is to do processing in flash memory. “We don’t have processors,” said Henry. “We use the memory to do the processing. We are able to do matrix math inside a flash memory array. We’re making silicon for inference.”

There are several types of AI computing workloads. In training, the network learns a new capability from existing data. In inference, the system applies its capabilities to new data, using its training to identify patterns and perform tasks, usually much more quickly than humans could.
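The difference is easy to see in a toy example. The sketch below is purely illustrative – a single linear layer in plain Python, not Mythic’s design – but it shows why inference is the lighter task: with the weights frozen, it reduces to the kind of matrix math Mythic says it runs inside a flash memory array, while training requires repeatedly comparing predictions against labeled data and updating the weights.

    # Toy contrast between training and inference for one linear layer.
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 2))              # weights: 4 inputs -> 2 outputs

    def infer(x):
        # Inference: a single matrix multiply with frozen weights.
        return x @ W

    def train_step(x, y_true, lr=0.01):
        # Training: compare predictions to labeled data, nudge the weights.
        global W
        y_pred = x @ W
        grad = x.T @ (y_pred - y_true) / len(x)   # gradient of squared error
        W -= lr * grad

    x_batch = rng.normal(size=(8, 4))        # existing, labeled data
    y_batch = rng.normal(size=(8, 2))
    train_step(x_batch, y_batch)             # the network learns
    print(infer(rng.normal(size=(1, 4))))    # the network is applied to new data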

“Once you have your algorithm trained, that’s where we come in,” said Henry. “New hardware will let us run all sorts of algorithms at the source of the data.”

Mythic CEO Mike Henry discusses the company’s hardware during a presentation at the O’Reilly AI conference in New York. (Photo: Rich Miller)

ARM Holdings is taking a similar approach, positioning its new chips to power AI processing on these edge devices.

“A cloud-centric approach is not an optimal long-term solution if we want to make the life-changing potential of AI ubiquitous and closer to the user for real-time inference and greater privacy,” writes Nandan Nayampally on the ARM blog. “ARM has a responsibility to rearchitect the compute experience for AI and other human-like compute experiences. To do this, we need to enable faster, more efficient and secure distributed intelligence between computing at the edge of the network and into the cloud.”

Another hardware startup focused on this space is Nebbiolo Technologies, which in February launched a fog computing platform featuring its FogNode hardware device and a fog software suite. The Nebbiolo suite is being used by robotics vendor KUKA to create a cloud-to-fog system to securely manage industrial robots.

The major hardware incumbents are also focusing more power on edge devices. Last year Intel acquired Movidius, a startup developing low-power coprocessors to provide computer vision for drones and virtual reality devices. Intel has marketed the E3900 series of its mobile Atom processor for IoT and fog computing applications. NVIDIA also has a contender in its Jetson platform, which is designed to bring GPU-accelerated parallel processing to mobile devices.

Software Suites Bring Cloud to the Edge

Cloud computing providers are deploying software to support this distributed computing. Microsoft has introduced Azure IoT Edge, a platform enabling cloud intelligence to run on IoT devices.

“At Microsoft we think there’s going to be a balance between the cloud and the IoT,” said Sam George, Director of Azure IoT at Microsoft. “Not all data will be sent to the cloud. With edge you’re taking logic you might have run in the cloud and now you’re deploying that logic down to the edge.”

Azure IoT Edge makes that logic available in Docker-style containers, allowing devices to act locally based on the data they generate, while also taking advantage of the cloud to configure and manage them.
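Microsoft’s module APIs aren’t spelled out here, so the sketch below is a generic illustration of the pattern George describes – act on data locally, send only what matters upstream. The sensor reader, the upstream helper and the threshold are hypothetical stand-ins, not Azure IoT Edge calls.

    # Generic sketch of edge-module logic: decide locally, summarize upstream.
    import json
    import random

    TEMP_LIMIT_C = 85.0                      # hypothetical local-action threshold

    def read_sensor():
        # Stand-in for a real device driver; returns a simulated temperature.
        return random.uniform(60.0, 95.0)

    def send_upstream(payload):
        # Stand-in for the module's upstream transport to the cloud.
        print("to cloud:", payload)

    readings = []
    for _ in range(120):                     # two minutes of one-second samples
        temp = read_sensor()
        if temp > TEMP_LIMIT_C:
            # Local decision -- no round trip to the cloud required.
            print("overheat: throttle the equipment locally")
        readings.append(temp)
        if len(readings) == 60:
            # Only a one-minute summary crosses the network.
            send_upstream(json.dumps({"avg_temp_c": sum(readings) / 60}))
            readings = []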

At the recent IoT World conference, George discussed how IoT Edge has extended the edge to the factory floor for Sandvik Coromant, which makes manufacturing tools and machining solutions. Azure IoT Edge collects and analyzes data from sensors embedded in the tools across the shop floor, monitoring every aspect of their performance as well as any bottlenecks in the manufacturing supply chain. Sandvik Coromant uses that analysis to recommend ways to optimize the manufacturing process and to create a predictive maintenance schedule designed to help avoid unscheduled shutdowns.

By moving the logic from the cloud to the fog, Azure IoT Edge allows Sandvik Coromant to reduce its round-trip response time from 2 seconds to 100 milliseconds.

Amazon Web Services has just rolled out AWS Greengrass, an IoT service that enables developers to create “serverless” code in the cloud using AWS Lambda functions and deploy it to devices for local execution. AWS Greengrass can be programmed to filter device data and transmit only the necessary information back to the cloud.
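Greengrass functions look much like their cloud counterparts. The sketch below is a minimal, hedged example of that filtering pattern, assuming the Greengrass Core SDK’s iot-data client; the topic name, payload fields and threshold are hypothetical.

    # Minimal sketch of a Greengrass Lambda function that filters device data
    # locally and publishes only anomalous readings back toward the cloud.
    import json
    import greengrasssdk

    client = greengrasssdk.client("iot-data")
    VIBRATION_LIMIT = 0.8                    # hypothetical alert threshold

    def function_handler(event, context):
        reading = event.get("vibration", 0.0)
        if reading > VIBRATION_LIMIT:
            # Only the readings that matter are transmitted upstream.
            client.publish(
                topic="factory/alerts",      # hypothetical topic
                payload=json.dumps({"device": event.get("id"),
                                    "vibration": reading}),
            )
        # Normal readings are handled or discarded on the device itself.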

About the Author

Rich Miller

I write about the places where the Internet lives, telling the story of data centers and the people who build them. I founded Data Center Knowledge, the data center industry's leading news site. Now I'm exploring the future of cloud computing at Data Center Frontier.
