The rise of autonomous vehicles (AVs) is poised to redefine the transportation landscape. As they move from futuristic concept to everyday reality, these driverless cars will have far-reaching effects on how drivers interact with them. Eventually, once the glamor wears off, AVs will need to provide an experience as seamless as using a smartphone to be accepted by the masses.
In general, several trends are emerging in the AV space, but the two main overarching directives are safety and convenience. Safety stands alone, but convenience has multiple components.
One trend is the evolution of advanced exteroceptive sensors such as solid-state and 3D LiDAR, frequency-modulated continuous-wave (FMCW) radar, AI-integrated cameras, and ultrasonic sensors. A parallel transformation continues in the advanced chip ecosystem: embedded processors with edge AI, intelligent application-specific integrated circuits (ASICs), purpose-built systems-on-chips (SoCs), and microcontrollers (MCUs).
On the safety side, there are chips designed specifically to meet the latency, power, and performance requirements of AVs, qualified to component standards such as AEC-Q100/200 and subject to regulatory and consumer-test regimes like UNECE R79 and NCAP. The ISO 26262 functional-safety standard defines the automotive safety integrity levels (ASILs); the highest levels are generally implemented with dual-core lockstep MCUs. For systems that require lower safety integrity levels, e.g., ASIL B, a simpler, lower-cost combination of safety mechanisms, such as error-correcting codes (ECC) and built-in self-test (BIST), can be used.
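To make the lockstep idea concrete, the C sketch below shows the general pattern: the same safety-relevant computation runs on two redundant paths, and a comparator rejects any divergence before the result is trusted. This is purely illustrative; real lockstep MCUs duplicate the core in hardware and compare outputs cycle by cycle, and every function and value here is hypothetical.

```c
#include <stdint.h>
#include <stdbool.h>

/* Illustration only: real lockstep MCUs perform this duplication
 * and comparison in hardware, every clock cycle, to catch transient
 * hardware faults that a single execution path would miss. */

/* Hypothetical safety-relevant computation: scale a raw wheel-speed
 * sample into km/h (coefficients are made up for illustration). */
static uint32_t wheel_speed_kmh(uint32_t raw_ticks)
{
    return (raw_ticks * 36U) / 10U;
}

/* Run the computation on two redundant paths and compare results.
 * Returns false (and withholds the output) if the paths disagree. */
static bool lockstep_compute(uint32_t raw_ticks, uint32_t *out)
{
    uint32_t primary = wheel_speed_kmh(raw_ticks);
    uint32_t checker = wheel_speed_kmh(raw_ticks);

    if (primary != checker) {
        /* Divergence detected: a real system would escalate to a
         * defined safe state rather than use the result. */
        return false;
    }
    *out = primary;
    return true;
}
```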
Scalability is another objective: these devices must scale from entry-level to luxury vehicles.
Safety, of course, comprises many components, but the main one is driver safety, which is largely supported by advanced driver-assistance systems (ADAS). In fact, ADAS alone is projected to prevent as many as 37 million accidents1 by 2050.
ADAS Is All About the Drive
ADAS is a complex, sophisticated data-acquisition and processing system that functions as the eyes and ears of the AV. The system consists of several essential components: an electronic control unit (ECU); sensors such as cameras, radar, and LiDAR; software; artificial intelligence (AI); and interfaces.
Those elements work together to collect and analyze vast amounts of data that, when combined through sensor fusion, yield an intelligent, self-driving vehicle.
Sensor Fusion and ADAS: The Ultimate Solution
Sensor fusion is an integral component of ADAS. It’s AI-enabled software that manages the complex data received from sensors. Together with advanced MCUs and SoCs, sensor fusion enables ADAS capabilities such as forward collision warning (FCW), autonomous emergency braking (AEB), collision avoidance, blind-spot detection (BSD), self-parking, lane-keeping, and more. The result is greater safety and convenience and a better overall driving experience.
Advanced sensors acquire content-rich, extremely fine-grained data from radar, LiDAR, inertial sensors, ultrasonic sensors, the Global Navigation Satellite System (GNSS), and more (Fig. 1). Sensor fusion aggregates, analyzes, and weights these sensor streams.
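To show what "weighting" means in practice, the C sketch below fuses two range measurements, one nominally from radar and one from LiDAR, by inverse-variance weighting, which is the same principle a Kalman filter applies recursively at each time step. The sensor values and noise figures are hypothetical, and production fusion stacks are far more elaborate.

```c
#include <stdio.h>

/* Minimal inverse-variance fusion of two range estimates:
 * the sensor with the lower noise variance gets the larger weight,
 * and the fused estimate is less noisy than either input. */
typedef struct {
    double range_m;   /* measured range, meters          */
    double variance;  /* measurement-noise variance, m^2 */
} range_meas_t;

static range_meas_t fuse(range_meas_t a, range_meas_t b)
{
    double wa = 1.0 / a.variance;
    double wb = 1.0 / b.variance;
    range_meas_t fused;

    fused.range_m  = (wa * a.range_m + wb * b.range_m) / (wa + wb);
    fused.variance = 1.0 / (wa + wb);  /* always below min(a, b) variance */
    return fused;
}

int main(void)
{
    /* Hypothetical numbers: the radar reading is noisier than LiDAR here. */
    range_meas_t radar = { 42.7, 0.25 };
    range_meas_t lidar = { 42.1, 0.04 };
    range_meas_t out   = fuse(radar, lidar);

    /* Prints a fused range of about 42.18 m with variance ~0.034. */
    printf("fused range: %.2f m (variance %.3f)\n",
           out.range_m, out.variance);
    return 0;
}
```

Note how the fused variance (about 0.034 m²) is smaller than either sensor's alone, which is the statistical payoff that motivates fusing redundant sensor streams in the first place.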