
Edge AI: From the art of the possible to the art of the tangible

Life-changing AI is finding its way into more and more edge devices, thanks to CPU-based machine learning technology

(Image: Surfer in rough seas)

Two teenagers have been dragged out to sea by a vicious riptide and are struggling desperately for their lives in the heavy surf a hundred yards off the coast of New South Wales in Australia.

Suddenly, a drone appears above them. It pauses and then drops an object that opens and inflates as it splashes into the water next to one of the swimmers. It’s a long, yellow flotation device, stretching itself out as if to grab hold of each of the teenagers. The swimmers cling to it and make it safely back to shore.

This scary episode really happened, and it is all the more notable for one reason: the drone was in the air as part of a government program that uses artificial intelligence (AI) to hunt for deadly sharks, using image recognition to detect whether an object in the water is an animal or a human. Lifeguards had received a distress call and redirected the drone to find and rescue the swimmers.

This is the new face of AI deployed ‘at the edge’ – using on-board computing rather than pushing data to the cloud for analysis and inference. It’s fast, super-efficient and doesn’t rely on a data connection to perform the AI compute.

A revolution at the edge

(Image: Arm AI- and ML-powered drone)

To date, much of the conversation about AI has centered on huge data centers that process and analyze data, then return the results to the edge for action. But developers are increasingly deploying AI at the edge for reasons including faster response times, privacy and security, and efficiency. And they’re doing so without having to figure out how to put a data center in your pocket. The key lies in how edge AI systems are being built – around the anchor point that has been propelling advanced computing for decades: the processor, or central processing unit (CPU).

The research firm IDC estimates that in 2017, 324 million edge devices used some form of AI (inference or training). The vast majority of AI growth is in inference, where machine learning (ML) algorithms are predominantly run on CPUs.

Backing that up, Arm recently commissioned an AI developer survey that found the largest share of respondents (42 percent) are doing the bulk of their edge AI and ML compute today on CPUs. We received responses from 350 developers working across consumer, business and industrial AI applications. These are engineers at the sharp end of AI compute – designing drones to save drowning swimmers, smart inhalers to improve the quality of life for asthma sufferers, and mobile phones that take one look at your face and know it’s you.
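To make that concrete, here is a minimal sketch of what CPU-based inference at the edge can look like, using the TensorFlow Lite interpreter to run an image classifier entirely on-device. The model file name and the dummy camera frame are placeholders for illustration only; they are not drawn from the survey or from any specific Arm product.

```python
# A minimal sketch of CPU-based image-classification inference at the edge,
# using the TensorFlow Lite interpreter. "classifier.tflite" is a hypothetical
# placeholder model, not something referenced by the survey or whitepaper.
import numpy as np
import tflite_runtime.interpreter as tflite  # pip install tflite-runtime

# Load a compact classification model; it runs entirely on the device's CPU.
interpreter = tflite.Interpreter(model_path="classifier.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input standing in for a camera frame, shaped to the model's input.
_, height, width, channels = input_details[0]["shape"]
frame = np.zeros((1, height, width, channels), dtype=input_details[0]["dtype"])

# Run inference on-device: no network round trip to a data center.
interpreter.set_tensor(input_details[0]["index"], frame)
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])[0]

print("Top class index:", int(np.argmax(scores)))
```

The same pattern – load a compact model once, then invoke it per frame on the local CPU – is what lets a device such as the rescue drone keep recognizing objects even without a connection back to a data center.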

To explore these design trends more deeply, we’ve published a whitepaper, “What’s Powering Artificial Intelligence”. It’s a must-read for any developer or engineer looking to explore the possibilities of deploying life-changing AI and ML solutions at the edge.
