
Plumerai is making deep learning tiny and radically more efficient to enable inference on small, cheap and low-power hardware. This makes it possible to embed intelligent, battery-powered sensors everywhere and create a future where we know more things faster. Our research has been published at top conferences (NeurIPS, MLSys) and we are backed by world-class investors.

We unlock huge value on the edge by using the most efficient form of deep learning: Binarized Neural Networks (BNNs). We combine our highly optimized inference software with our collection of BNN models, trained on our proprietary datasets. We provide a turnkey solution for person detection on off-the-shelf Arm Cortex-M and Arm Cortex-A chips, and we are extending our technology to speech recognition, hand gesture recognition, gaze detection and pose estimation.
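To illustrate why BNNs are so efficient on small chips: when weights and activations are constrained to {-1, +1} and packed as bits, a dot product collapses into an XNOR plus a population count, replacing many multiply-accumulates with a couple of bitwise instructions. The sketch below shows the idea in plain Python; the function names and bit-packing scheme are illustrative assumptions, not Plumerai's actual implementation.

```python
# Minimal sketch of the core BNN operation (not Plumerai's code):
# values in {-1, +1} are packed as bits (+1 -> 1, -1 -> 0), so a
# length-n dot product becomes n - 2 * popcount(a XOR b).

def pack_bits(values):
    """Pack a list of +1/-1 values into an integer bitmask (+1 -> 1, -1 -> 0)."""
    mask = 0
    for i, v in enumerate(values):
        if v == 1:
            mask |= 1 << i
    return mask

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {-1, +1} vectors of length n.

    Each matching bit contributes +1, each mismatch -1, so
    dot = (n - mismatches) - mismatches = n - 2 * popcount(a XOR b).
    """
    mismatches = bin(a_bits ^ b_bits).count("1")
    return n - 2 * mismatches

# Example: [-1, +1, +1, -1] . [+1, +1, -1, -1] = -1 + 1 - 1 + 1 = 0
a = pack_bits([-1, 1, 1, -1])
b = pack_bits([1, 1, -1, -1])
print(binary_dot(a, b, 4))  # -> 0
```

On a Cortex-M class core the same reduction maps to a handful of XOR and bit-count instructions per 32 weights, which is where the memory and latency savings of binarization come from.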

Solution Briefs

  • Plumerai - Person Detection on Arm Cortex-M

    Plumerai provides a highly accurate and efficient turnkey software solution for person detection on Arm Cortex-M.

    Learn More

Insights

  • Squeezing data center AI into a tiny microcontroller

    What’s the point of making deep learning tiny if the resulting system misses key detections? Our person detection model fits on a tiny Arm Cortex-M7 microcontroller and is as accurate as Google’s much larger EfficientDet-D4 running on an NVIDIA GPU.

    Learn More
  • The world’s fastest deep learning inference software for Arm Cortex-M

    Our inference software for Arm Cortex-M microcontrollers is the fastest and most memory-efficient in the world. It has 40% lower latency and uses 49% less RAM than TensorFlow Lite for Microcontrollers kernels while retaining the same accuracy.

    Learn More
  • Great TinyML needs high-quality data

    BNNs help us reduce the memory footprint, inference latency and energy consumption of our AI models, but something we have been less vocal about is at least as important for AI in the real world: high-quality data.

    Learn More