Endangered Species Sparks Innovative Arm-based Solution
Around the world, wildlife officials are harnessing the power of digital technologies to safeguard endangered species and combat threats from poachers. Battery-powered cameras have emerged as a revolutionary tool in this regard; however, the demands of monitoring vast and diverse ecosystems are substantial.
Grovety’s mission was clear: enhance the efficiency and effectiveness of wildlife camera traps by incorporating artificial intelligence (AI) to improve detection and classification accuracy. Its expertise in software design and AI integration, its dedication to addressing real-world challenges in wildlife conservation, and advanced Arm processors together yielded a groundbreaking prototype of a wildlife camera trap to monitor and protect endangered species.
- Minimal power consumption of its AI module on the Alif SoC.
- Arm Cortex-A32 processors with clock speeds of up to 800MHz.
- Reduced false positives in wildlife camera traps, improving efficiency and battery life.
Energy Efficiency and Battery Life
Grovety selected the Alif Semiconductor E7 processor, a member of the Ensemble family of embedded fusion processors and microcontrollers. It combines ultra-efficient Arm Cortex-A32 application processors clocked at up to 800MHz with Arm Cortex-M55 cores running at 400MHz and 160MHz.
The E7 series is architected for power efficiency and long battery life, delivering high compute and machine learning capabilities, multilayered security, computer vision, and highly interactive human-machine interfaces. The E7 series features a secure enclave system and a firewall control security unit, as well as a dedicated Arm Ethos-U55 NPU to run embedded machine learning.
Grovety estimated that the power consumption of its AI module on the Alif SoC would be minimal: a typical 2200mAh 18650 lithium-ion cell could power the module for just over three weeks. The effort aimed to significantly reduce false positives in wildlife camera traps, improving efficiency and battery life for conservationists.
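As a rough illustration of what that estimate implies, the back-of-envelope calculation below derives the average current budget from the figures quoted above (2200mAh, just over three weeks). The assumption of a constant draw and full usable cell capacity is ours for illustration, not a detail from Grovety's measurements.

```python
# Back-of-envelope check of the battery-life estimate quoted above.
# Assumption (illustrative, not from the case study): the 18650 cell
# delivers its full rated capacity and the module's draw is roughly constant.

BATTERY_CAPACITY_MAH = 2200   # typical 18650 lithium-ion cell
RUNTIME_DAYS = 21             # "just over three weeks"

average_current_ma = BATTERY_CAPACITY_MAH / (RUNTIME_DAYS * 24)
print(f"Average current budget: {average_current_ma:.1f} mA")
# Roughly 4.4 mA on average, which illustrates why duty-cycling the sensor
# and running inference on a low-power NPU matter so much for field devices.
```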
Leveraging Apache TVM and Collaborative Development
Grovety leveraged Apache TVM, the open-source machine learning compiler framework for CPUs, GPUs, and machine learning accelerators, to fine-tune its ML models. The team made significant contributions to TVM's Arm Ethos-U55 integration, enhancing its ability to offload computations to the NPU. This collaborative approach was central to the goal of creating an autonomous AI-powered device for object detection and classification, adaptable to a variety of wildlife subjects. Grovety used the open-source Vela compiler to convert a TensorFlow Lite for Microcontrollers neural network model into an optimized version that runs on the Arm Ethos-U NPU.
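For readers unfamiliar with that last step, the sketch below shows how a quantized TensorFlow Lite for Microcontrollers model is typically passed through the Vela compiler to target an Ethos-U55. The model file name, output directory, and the ethos-u55-128 configuration are illustrative assumptions, not details from Grovety's project.

```python
# Minimal sketch of a Vela compilation step, assuming Vela is installed
# (pip install ethos-u-vela). File names and NPU configuration are
# hypothetical placeholders.
import subprocess

subprocess.run(
    [
        "vela",
        "wildlife_detector_int8.tflite",           # quantized TFLite Micro model (placeholder name)
        "--accelerator-config", "ethos-u55-128",   # Ethos-U55 MAC configuration to target
        "--output-dir", "vela_output",             # optimized model is written here
    ],
    check=True,  # raise an error if the compiler fails
)
```

The optimized model wraps the NPU-mapped portions of the graph in an Ethos-U custom operator and is typically deployed with the TensorFlow Lite Micro runtime and the Ethos-U driver on the Cortex-M55.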
Discover the Real-World Impact of Edge AI
Explore in-depth use cases that show how edge AI is powering the next generation of IoT, solving real-world problems, driving faster decisions, and lowering costs through smarter operations right at the device level.
- Grovety builds AI camera traps for wildlife monitoring, using Arm processors for efficient, on-device detection.
- Arm Cortex-M55 and Ethos-U55 enable efficient, on-device object detection with low power draw.
- The Alif E7’s energy-efficient design extends battery life to support weeks of autonomous operation.
- Apache TVM and Arm’s Vela compiler help Grovety optimize TensorFlow Lite models for embedded ML.
- With Arm, Grovety reduced false positives and improved detection accuracy for wildlife conservation.