Ambarella Unveils World’s First Centrally Processed 4D Imaging Radar Architecture for Autonomous Mobility Systems
Ambarella, Inc. (NASDAQ: AMBA) has unveiled a groundbreaking centralized 4D imaging radar architecture utilizing its Oculii™ Adaptive AI Radar Software and 5nm CV3 AI Domain Control SoCs. This architecture allows unprecedented real-time processing and fusion of radar data with other sensor inputs, enhancing environmental perception for ADAS and autonomous driving applications. Ambarella's solution is designed to be significantly more power-efficient while offering high angular resolution and extended detection range, addressing growing demands in the automotive sector.
- Launch of centralized 4D imaging radar architecture enhances radar processing efficiency.
- Significantly reduced power consumption compared to competing technologies.
- High angular resolution of 0.5 degrees and long detection range of 500+ meters.
- Dynamic allocation of processing resources improves real-time sensor fusion.
Ambarella’s Oculii™ Adaptive AI Radar Software and Highly Efficient 5nm CV3 AI Domain Control SoCs Enable Central Processing and Fusion of Raw 4D Imaging Radar Data for First Time
SANTA CLARA, Calif., Dec. 06, 2022 (GLOBE NEWSWIRE) -- Ambarella, Inc. (NASDAQ: AMBA), an edge AI semiconductor company, today announced the world’s first centralized 4D imaging radar architecture, which allows both central processing of raw radar data and deep, low-level fusion with other sensor inputs, including cameras, lidar and ultrasonics. This breakthrough architecture provides greater environmental perception and safer path planning in AI-based ADAS and L2+ to L5 autonomous driving systems, as well as autonomous robotics. It features Ambarella’s Oculii™ radar technology, including the only AI software algorithms that dynamically adapt radar waveforms to the surrounding environment, providing high angular resolution of 0.5 degrees, an ultra-dense point cloud of up to tens of thousands of points per frame and a long detection range of more than 500 meters. All of this is achieved with an order of magnitude fewer antenna MIMO channels, which reduces the data bandwidth and achieves significantly lower power consumption than competing 4D imaging radars. Ambarella’s centralized 4D imaging radar with Oculii technology provides a flexible, high-performance perception architecture that enables system integrators to future-proof their radar designs.
Watch our short video here:
https://www.globenewswire.com/NewsRoom/AttachmentNg/fa354596-e725-4ec8-b0fc-fe7be3aae244
“There were ~100M radar units manufactured in 2021 for automotive ADAS,” explains Cédric Malaquin, Team Lead Analyst of RF activity at Yole Intelligence, part of Yole Group. “We expect this volume to grow 2.5-fold by 2027, given the more demanding regulations on safety and more advanced driving automation systems hitting the road. Indeed, from the current 1-3 radar sensors per car, OEMs will move to 5 radar sensors per car as a baseline (1). Besides, there is an exciting debate on the radar processing partitioning and many developments associated. One approach is centralized radar computing that will enable OEMs to offer significantly higher performance imaging radar systems and new ADAS/AD features while simultaneously optimizing the cost of radar sensing.”
To create this unique, cost-effective new architecture, Ambarella optimized the Oculii algorithms for its CV3 AI domain controller SoC family and added radar-specific signal processing acceleration. The CV3’s industry-leading AI performance per watt provides the high compute and memory capacity needed to achieve high radar density, range and sensitivity. Additionally, a single CV3 can efficiently provide high-performance, real-time processing for perception, low-level sensor fusion and path planning, centrally and simultaneously, within autonomous vehicles and robots.
“No other semiconductor and software company has advanced in-house capabilities for both radar and camera technologies, as well as AI processing,” said Fermi Wang, President and CEO of Ambarella. “This expertise allowed us to create an unprecedented centralized architecture that combines our unique Oculii radar algorithms with the CV3’s industry-leading domain control performance per watt to efficiently enable new levels of AI perception, sensor fusion and path planning that will help realize the full potential of ADAS, autonomous driving and robotics.”
The data sets of competing 4D imaging radar technologies are too large to transport and process centrally. Each module generates multiple terabits per second of data and consumes more than 20 watts of power, because of the thousands of MIMO antenna channels it needs to provide the high angular resolution required for 4D imaging radar. Multiplied across the six or more radar modules required to cover a vehicle, this data volume makes central processing impractical for other radar technologies, which must instead process radar data at each module, across thousands of antennas.
By applying AI software to dynamically adapt the radar waveforms generated with existing monolithic microwave integrated circuit (MMIC) devices, and by using AI sparsification to create virtual antennas, Oculii technology reduces the antenna array for each processor-less MMIC radar head in this new architecture to 6 transmit x 8 receive. Overall, the number of MMICs is drastically reduced, while an extremely high joint azimuth and elevation angular resolution of 0.5 degrees is still achieved. Additionally, Ambarella’s centralized architecture consumes significantly less power at the maximum duty cycle and reduces the bandwidth required for data transport by 6x, while eliminating the need for pre-filtered edge processing and its resulting loss of sensor information.
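As a rough back-of-the-envelope illustration of the channel-count argument above (not Ambarella’s published math), the sketch below compares a 6 transmit x 8 receive Oculii radar head with a conventional edge-processed module. The competing per-module channel count and the per-vehicle head count are assumptions loosely based on the “thousands of MIMO antennas” and “six or more radar modules” figures in this release.

```python
# Illustrative MIMO channel-count comparison (assumptions, not product specs).

OCULII_TX = 6            # transmit antennas per processor-less radar head (per release)
OCULII_RX = 8            # receive antennas per processor-less radar head (per release)
HEADS_PER_VEHICLE = 6    # "six or more radar modules required to cover a vehicle"

COMPETING_CHANNELS_PER_MODULE = 2_000   # assumed: release only says "thousands"

oculii_channels_per_head = OCULII_TX * OCULII_RX   # 48 Tx/Rx channel pairs per head
oculii_channels_per_vehicle = oculii_channels_per_head * HEADS_PER_VEHICLE
competing_channels_per_vehicle = COMPETING_CHANNELS_PER_MODULE * HEADS_PER_VEHICLE

print(f"Oculii head: {oculii_channels_per_head} MIMO channels")
print(f"Oculii vehicle total: {oculii_channels_per_vehicle} channels")
print(f"Competing vehicle total (assumed): {competing_channels_per_vehicle} channels")
print(f"Reduction factor: ~{competing_channels_per_vehicle / oculii_channels_per_vehicle:.0f}x")
```

Under these assumed figures, the per-vehicle channel count drops by well over an order of magnitude, which is what makes transporting raw radar data to a central processor tractable.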
This cost-effective, software-defined centralized architecture also enables dynamic allocation of the CV3’s processing resources based on real-time conditions, both between sensor types and among sensors of the same type. For example, in heavy rain that degrades long-range camera data, the CV3 can shift some of its resources to improve radar inputs. Likewise, when the vehicle is driving on a highway in the rain, the CV3 can focus on data from the front-facing radar sensors to further extend the vehicle’s detection range while providing faster reaction times. This is not possible with an edge-based architecture, where radar data is processed at each module and where processing capacity, specified for worst-case scenarios, often goes underutilized.
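To make the resource-shifting idea concrete, here is a minimal, hypothetical sketch of condition-based compute allocation. The sensor names, weights and budget values are illustrative assumptions only; they do not represent Ambarella’s scheduler or any CV3 API.

```python
# Hypothetical sketch of dynamic compute allocation between sensor types.
# Names and numbers are invented to illustrate the idea of shifting a fixed
# central compute budget based on real-time driving conditions.

TOTAL_COMPUTE_BUDGET = 100  # arbitrary units of central SoC compute

def allocate_compute(heavy_rain: bool, highway: bool) -> dict:
    """Split the central compute budget between camera and radar processing."""
    # Baseline: favor long-range camera perception in clear conditions.
    weights = {"camera": 0.60, "front_radar": 0.25, "corner_radars": 0.15}

    if heavy_rain:
        # Rain degrades long-range camera data, so shift compute toward radar.
        weights = {"camera": 0.35, "front_radar": 0.45, "corner_radars": 0.20}

    if heavy_rain and highway:
        # On a highway in rain, prioritize front radar to extend detection range.
        weights = {"camera": 0.30, "front_radar": 0.55, "corner_radars": 0.15}

    return {sensor: round(TOTAL_COMPUTE_BUDGET * w) for sensor, w in weights.items()}

print(allocate_compute(heavy_rain=True, highway=True))
# {'camera': 30, 'front_radar': 55, 'corner_radars': 15}
```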
These two different approaches to radar processing are summarized in the following table:
| Competing Edge-Processed Radar | Ambarella’s Centralized Radar Processing |
| --- | --- |
| Constant, repeated radar waveforms without regard for environmental conditions | Oculii™ AI software algorithms dynamically adapt radar waveforms to the surrounding environment |
| MMIC + edge radar processor in each module | MMIC-only “radar head” |
| Radar detection processing in the radar module | Radar detection processing in the central processor |
| Multiple terabits per second of radar data per module (too large to transport and process centrally) | 6x bandwidth reduction for radar data transport |
| 1+ to 2 degrees of angular resolution | 0.5 degrees of joint azimuth and elevation angular resolution |
| High power consumption, due to 1000s of antenna MIMO channels used by each radar module | Low power consumption, due to an order of magnitude fewer antenna MIMO channels (6 transmit x 8 receive antennas in each processor-less MMIC radar head) |
| No dynamic processing allocation (specified for worst-case scenarios) | Dynamic allocation of the CV3’s processing resources, based on real-time conditions, between sensor types and among sensors of the same type |
| Slow processing speeds | CV3 is up to 100x faster than traditional edge radar processors |
CV3 marks the debut of Ambarella’s next-generation CVflow® architecture, with a neural vector processor and a general vector processor, which were both designed by Ambarella from the ground up to include radar-specific signal processing enhancements. These processors work in tandem to run the Oculii advanced radar perception software with far higher performance, including speeds up to 100x faster than traditional edge radar processors can achieve.
Additional benefits of this new centralized architecture include easier over-the-air (OTA) software updates, for continuous improvement and future-proofing. Each edge radar module’s processor must be updated individually, after determining which processor and OS it runs; with the centralized architecture, a single OTA update pushed to the CV3 SoC covers all of the system’s radar heads. Because these radar heads contain no processor, they also reduce costs, both in the upfront bill of materials and when replaced after an accident (most radars are located behind the vehicle’s bumper). Additionally, many of the edge-processor radar modules deployed today never receive software updates because of this software complexity.
Target applications for the new centralized radar architecture include ADAS and level 2+ to level 5 autonomous vehicles, as well as autonomous mobile robots (AMRs) and automated guided vehicle (AGV) robots. These designs are streamlined by Ambarella’s unified and flexible software development environment, which provides automotive and robotics designers with a software-upgradable platform for scaling performance from ADAS and L2+ to L5.
Availability
This new centralized architecture will be demonstrated at Ambarella’s invitation-only event taking place during CES. Contact your Ambarella representative to schedule a meeting. For sampling and evaluation information on the Oculii AI radar technology and CV3 AI domain controller SoC family, contact Ambarella: https://www.ambarella.com/contact-us/.
About Ambarella
Ambarella’s products are used in a wide variety of human and computer vision applications, including video security, advanced driver assistance systems (ADAS), electronic mirror, drive recorder, driver/cabin monitoring, autonomous driving and robotics applications. Ambarella’s low-power systems-on-chip (SoCs) offer high-resolution video compression, advanced image processing and powerful deep neural network processing to enable intelligent perception, fusion and central processing systems to extract valuable data from high-resolution video and radar streams. For more information, please visit www.ambarella.com.
Ambarella Contacts
- Media contact: Eric Lawson, elawson@ambarella.com, +1 480-276-9572
- Investor contact: Louis Gerhardy, lgerhardy@ambarella.com, +1 408-636-2310
- Sales contact: https://www.ambarella.com/contact-us/
1. Source: Radar for Automotive report, Yole Intelligence, 2022
All brand names, product names, or trademarks belong to their respective holders. Ambarella reserves the right to alter product and service offerings, specifications, and pricing at any time without notice. © 2022 Ambarella. All rights reserved.
A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/34b95a6f-1a28-4f5a-83b7-d245b1980f12