Software Development Kit

The Arm NN SDK is a free set of open-source Linux software tools that enable machine learning workloads on power-efficient devices. This inference engine provides a bridge between existing neural network frameworks and power-efficient Arm Cortex CPUs, Arm Mali GPUs and the Arm Machine Learning processor.

Features and Benefits
NN Framework Translation 

Enables the accurate translation of models from existing NN frameworks, such as TensorFlow and Caffe, so they can run smoothly and without modification.

Effective Targeting of CPUs and GPUs 

Use the Compute Library to target programmable cores, such as Cortex-A CPUs and Mali GPUs, as efficiently as possible.

Arm NN for Android 

Arm NN for NNAPI, Google's interface for accelerating neural networks on Android devices, provides a Hardware Abstraction Layer (HAL) that targets Mali GPUs and delivers more than a fourfold performance boost.

Comprehensive Support

Includes support for the Arm Machine Learning processor and, via CMSIS-NN, Cortex-M CPUs.

Artificial Intelligence

The new class of ultra-efficient machine learning and object detection processors is purpose-built to redefine device capabilities and transform our lives. 

Talk with an Expert

Interested in building machine learning workloads on power-efficient devices? Talk with Arm experts to learn more.

Related Products and Services
Explore More Options and Features
Arm Machine Learning Processor

An optimized, ground-up processor design for machine learning.

Compute Library

This software library is a collection of low-level functions optimized for Arm CPU and GPU architectures, targeting image processing, computer vision, and machine learning. It is available free of charge under a permissive MIT open-source license.

Arm NN Resources

Project Trillium

Project Trillium is a suite of Arm IP designed to deliver scalable ML and neural network functionality at any point on the performance curve, from sensors to mobile and beyond. Gain insights into how ML processing requirements vary significantly according to workload, and let Arm experts help you navigate the path.


Visit Arm Developer