Arm introduces new automotive image signal processor to advance adoption of driver assistance and automation technologies
By Chet Babla, VP of automotive, Automotive and IoT Line of Business, Arm
- Arm adds Mali-C78AE image signal processor (ISP) to its “AE” line of safety-capable IP suitable for ADAS and human vision applications
- Mali-C78AE combined with Cortex-A78AE CPU and Mali-G78AE GPU provides optimal ADAS vision pipeline
- Mobileye is first to license Mali-C78AE, in addition to Mali-G78AE, for its next-generation EyeQ technology
Advanced Driver Assist Systems (ADAS) have grown from a premium vehicle feature into a capability consumers now expect as standard in new vehicles. In parallel, the global chip shortage is making clear to the automotive industry how critical silicon and electronics are to the development and competitive positioning of its products. Drivers increasingly depend on ADAS applications such as collision avoidance, lane departure warnings and automated emergency braking, and vehicles increasingly rely on cameras positioned around the car to enable many of these features. Indeed, according to a recent report from Strategy Analytics, the value of the automotive camera market is expected to grow by more than 19% from 2020 to 2025, making the camera the most important sensor type in providing the data a vehicle needs to make decisions about its surroundings.
As the number and sophistication of vehicle cameras increases, so does the compute power needed to translate the high throughput of image data – efficiently and safely – into outputs that meet the varying requirements for machine and human vision. To enable new capabilities in ADAS and autonomous driving, the industry will need a new approach to image processing, and to address this, we have added the Mali-C78AE ISP to our portfolio of IP specifically developed to meet the performance and safety needs of automotive applications.
Safety first in both human and machine vision
ADAS features use multiple cameras to enable a variety of human and machine vision applications. For example, surround-view systems use data from cameras around the vehicle to display information that helps the driver make decisions while parking. Adaptive cruise control, on the other hand, uses camera data directly to interpret the environment and make decisions about vehicle control, such as applying the throttle or brake, independently of the driver. Mali-C78AE is designed specifically to address both human and machine vision safety applications, and can process data from up to four real-time or 16 virtual cameras.
We know safety is paramount in ADAS, and I have spoken about this previously: a fault or failure in the operation of an ADAS system could be dangerous, threatening the wellbeing of the driver, passengers and other road users. Mali-C78AE was developed from the ground up with hardware safety mechanisms and diagnostic software features that enable system designers to meet ISO 26262 ASIL B functional safety requirements. Mali-C78AE aims to prevent or detect faults in a single camera frame that could result in incorrectly processed frame data. To do this, the ISP features more than 380 fault-detection circuits and continuous built-in self-test, and can detect sensor and hardware faults in connected cameras.
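To give a flavor of what one such camera diagnostic can look like, here is a minimal software sketch of a "frozen frame" check: flagging a sensor fault when consecutive frames are byte-identical for too long. This is purely illustrative; Mali-C78AE implements its fault detection in hardware, and the class and threshold below are my own assumptions, not Arm's design.

```python
import zlib

class FrameMonitor:
    """Illustrative sketch of one camera diagnostic: detect a 'frozen'
    sensor by comparing CRCs of consecutive frames. Not Arm's design;
    the real ISP performs its checks in dedicated hardware circuits."""

    def __init__(self, max_identical: int = 3):
        self.max_identical = max_identical  # repeats tolerated before fault
        self.last_crc = None
        self.repeat_count = 0

    def check(self, frame_bytes: bytes) -> bool:
        """Return True while the frame stream looks healthy."""
        crc = zlib.crc32(frame_bytes)
        if crc == self.last_crc:
            self.repeat_count += 1
        else:
            self.repeat_count = 0
        self.last_crc = crc
        return self.repeat_count < self.max_identical

mon = FrameMonitor()
print(mon.check(b"frame-a"))  # True: first frame
print(mon.check(b"frame-a"))  # True: one repeat is tolerated
```

A production diagnostic would combine many such checks (checksum mismatches, timing violations, stuck pixels) and report them through the safety island rather than a boolean return.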
Vision is data-rich and workload-demanding
Equally important to safety and user experience is processing speed, a key element of the Mali-C78AE. Acquiring an image at the sensor, processing it through the ISP and then the GPU, and displaying it on a screen should take no more than 150 milliseconds; anything longer is noticeable to the driver when using parking assist, for example. In a machine vision application, a vehicle should not travel more than 250mm between a camera image being acquired and it being presented to the decision-making processor; anything longer means the machine vision system is too slow to react in driving situations where accurate and timely decisions are critical.
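The 250mm travel budget implies a latency budget that tightens as speed rises. As a rough back-of-the-envelope sketch (the arithmetic is mine; only the 250mm figure comes from the text above):

```python
# The 250 mm travel budget means: max latency = 0.25 m / vehicle speed.

def max_latency_ms(speed_kmh: float, travel_budget_m: float = 0.25) -> float:
    """Maximum camera-to-decision latency (ms) so the vehicle moves no
    more than travel_budget_m while the frame is 'in flight'."""
    speed_m_per_s = speed_kmh / 3.6            # km/h -> m/s
    return travel_budget_m / speed_m_per_s * 1000.0  # s -> ms

for speed in (10, 50, 130):
    print(f"{speed:>3} km/h -> {max_latency_ms(speed):5.1f} ms budget")
# At 50 km/h the whole pipeline has roughly 18 ms per frame.
```

At parking speeds the budget is generous (around 90 ms at 10 km/h), but at highway speeds it shrinks below 10 ms, which is why end-to-end pipeline latency matters so much for machine vision.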
To enable drivers and machines to make the best possible decisions, ADAS cameras must extract the most relevant information possible from each frame. Mali-C78AE employs advanced noise-reduction technology and dynamic range management to ensure each frame is clear and properly exposed, adjusting overly dark or bright areas of a frame. Mali-C78AE can process camera data in real time from up to four high-resolution, high-frame-rate cameras, significantly reducing memory, communications and processing requirements for a more efficient system.
Today, implementing multiple ADAS functions requires individual camera setups, because cameras used for machine vision applications, such as lane departure warnings, do not produce images suitable for human vision applications, such as surround view. To reduce the cost of implementing multiple ADAS functions, Mali-C78AE enables camera sensors to be dual-purpose: it downscales and color-translates the outputs of sensors optimized for machine vision to create images adapted to the human eye. By avoiding duplicated cameras and their associated electronics and wiring, OEMs save on cost and complexity, enabling wider deployment of camera-based ADAS features across a diverse range of car models and providing a safer, better user experience for drivers.
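Conceptually, "downscale and color-translate" can be sketched as a block-average resize followed by a 3x3 color-correction matrix mapping the sensor's channels to display RGB. The NumPy sketch below is a simplified illustration under my own assumptions (a 3-channel float frame and a made-up matrix), not the ISP's actual algorithm:

```python
import numpy as np

def machine_to_human_view(frame: np.ndarray,
                          color_matrix: np.ndarray,
                          scale: int = 2) -> np.ndarray:
    """Downscale a (H, W, 3) frame by block-averaging, then apply a 3x3
    color-translation matrix and clip to an 8-bit display range.
    Conceptual sketch only; the real ISP pipeline is far more involved."""
    h, w, c = frame.shape
    # Block-average downscale over scale x scale tiles.
    small = frame[:h - h % scale, :w - w % scale].reshape(
        h // scale, scale, w // scale, scale, c).mean(axis=(1, 3))
    # Per-pixel channel translation: out = small @ color_matrix.T
    rgb = small @ color_matrix.T
    return np.clip(rgb, 0, 255).astype(np.uint8)

# Hypothetical correction matrix; a real one comes from sensor calibration.
ccm = np.array([[ 1.2, -0.1, -0.1],
                [-0.1,  1.2, -0.1],
                [-0.1, -0.1,  1.2]])
frame = np.full((480, 640, 3), 100.0)   # dummy machine-vision frame
out = machine_to_human_view(frame, ccm)
print(out.shape)  # (240, 320, 3): half resolution, display-ready RGB
```

The point of the dual-purpose design is that this translation happens once in the ISP, so a single physical camera can feed both the perception stack and the driver's display.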
A strong vision pipeline for ADAS adopted by industry pioneer
Mali-C78AE is an important element of specialized processing required for ADAS systems. Combined with Cortex-A78AE and Mali-G78AE, the addition of Mali-C78AE provides a full ADAS vision pipeline to optimize performance, minimize power consumption, and provide a consistent approach to functional safety.
Leading the charge in implementing the new Mali-C78AE ISP in the next generation of its EyeQ technology is Mobileye, a pioneer in automotive vision-safety technology. When Mobileye started development of the Mobileye EyeQ Ultra and EyeQ6H, it selected Mali-C78AE to process image data efficiently and, by coupling it with the Mali-G78AE GPU, to enable the safety-capable, smooth, real-time graphics rendering needed to meet the application's demanding requirements.
A strong vision pipeline is increasingly important to powering the next phase of mass-market ADAS deployment. The vehicle is one of the most complex electronics-enabled devices consumers will buy, and it comes with several constraints the automotive industry must adhere to in order to continue improving driver safety and user experience. ADAS features will rely on safe, flexible, power-efficient vision technology that can be easily scaled across different vehicle types, models, configurations, and price points. And this is exactly what Mobileye has done with its next-generation EyeQ technology, deploying specialized ISP and GPU processing using Mali-C78AE and Mali-G78AE to meet the growing demands of parking-assistance and visualization workloads. I'm really excited to see how else Arm's suite of "AE" processing technologies can open opportunities for future ADAS and automated driving applications.
Arm technology is defining the future of computing. Our energy-efficient processor designs and software platforms have enabled advanced computing in more than 240 billion chips and our technologies securely power products from the sensor to the smartphone and the supercomputer. Together with 1,000+ technology partners, we are enabling artificial intelligence to work everywhere, and in cybersecurity, we are delivering the foundation for trust in the digital world – from chip to cloud. The future is being built on Arm.
All information is provided "as is" and without warranty or representation. This document may be shared freely, attributed and unmodified. Arm is a registered trademark of Arm Limited (or its subsidiaries). All brands or product names are the property of their respective holders. © 1995-2023 Arm Group.