Software Development Kit
Free of charge, the Arm NN SDK is a set of open-source Linux software tools that enable machine learning workloads on power-efficient devices. This inference engine provides a bridge between existing neural network frameworks and power-efficient Arm Cortex-A CPUs, Arm Mali GPUs, and Arm Ethos NPUs.
Enables rapid application development through support for commonly used frameworks such as TensorFlow, Caffe, and ONNX.
Use the Compute Library to target programmable cores, such as Cortex-A CPUs and Mali GPUs, as efficiently as possible.
Arm NN works with NNAPI, Google’s interface for accelerating neural networks on Android devices, to target Arm Mali GPUs and Arm Ethos NPUs, enabling substantial performance gains.
The new class of ultra-efficient machine learning processors is purpose-built to redefine device capabilities and transform our lives.
Interested in building machine learning workloads on power-efficient devices? Talk with Arm experts to learn more.
The Cortex-A processor series is designed for complex compute tasks, such as hosting a rich operating system platform and supporting multiple software applications.
Mali Graphics Processors
Combining graphics and GPU Compute technology, Mali GPUs offer a diverse selection of scalable solutions, from low-power to high-performance smartphones, tablets, and DTVs.
The Arm Ethos processor series delivers the highest throughput and efficiency in the lowest area for machine learning inference from cloud to edge to endpoint.
This software library is a collection of low-level functions optimized for Arm CPU and GPU architectures, targeting image processing, computer vision, and machine learning. It is available free of charge under the permissive MIT open-source license.
Open Source Accelerates Adoption of ML
To support the Machine Intelligence Initiative by Linaro, Arm has donated Arm NN, our open-source neural network machine learning (ML) software. We believe that open-source collaboration with Linaro and other ecosystem partners helps reduce fragmentation and minimizes duplication of effort. Through this initiative, SoC and NN vendors can continue to differentiate with their key competitive advantages. Learn more about the Linaro Machine Intelligence Initiative in this video.