Software Development Kit
Free of charge, the Arm NN SDK is a set of open-source Linux software tools that enables machine learning workloads to run on power-efficient devices. This inference engine provides a bridge between existing neural network frameworks and power-efficient Arm Cortex-A CPUs, Arm Mali GPUs, and the Arm Machine Learning processor.
Cortex-A Processors
The Cortex-A processor series is designed for complex compute tasks, such as hosting a rich operating system platform and supporting multiple software applications, in devices ranging from laptops to embedded designs. Cortex-A CPUs power intelligent solutions, from edge to cloud, for next-generation experiences.
Mali Graphics Processors
Including both graphics and GPU Compute technology, Mali GPUs offer a diverse selection of scalable solutions for low-power to high-performance smartphones, tablets, and DTVs.
Machine Learning Processor
Based on a new, class-leading architecture, the Arm ML processor provides best-in-class performance and energy efficiency. Its optimized design enables new features, enhances user experience and delivers innovative applications for a wide array of market segments including mobile, IoT, embedded, automotive, and infrastructure.
Compute Library
The Arm Compute Library is a collection of low-level functions optimized for Arm CPU and GPU architectures, targeting image processing, computer vision, and machine learning. It is available free of charge under a permissive MIT open-source license.
Open Source Accelerates Adoption of ML
To support the Machine Intelligence Initiative by Linaro, Arm has donated Arm NN, our open-source neural network machine learning (ML) software. We believe that open-source collaboration with Linaro and other ecosystem partners helps reduce fragmentation and minimizes duplication of effort. Through this initiative, SoC and NN vendors can continue to differentiate on their key competitive advantages. Learn more about the Linaro Machine Intelligence Initiative.