Radar, Lidar, and More

Which Sensors Do Autonomous Cars Need?

Autonomous driving is unthinkable without high-tech sensors.

For new assistance systems and automated driving functions, the number, arrangement, type, and quality of sensors are crucial. In particular, radar, lidar, and video systems are currently making significant progress.

What people cannot manage without contorting themselves, cars of the present and future are expected to master effortlessly: 360-degree perception of their surroundings. This is an indispensable prerequisite for reliably enabling automated and autonomous driving functions. Safe, functional systems depend on highly developed sensors and imaging processes that generate a realistic picture of the environment. Current Advanced Driver Assistance Systems (ADAS) already use around a hundred such sensors.

Ultrasound - the constant factor in sensor technology

Ultrasound systems have been familiar for many years, primarily as parking sensors at the front and rear of vehicles. The sensors emit sound pulses and evaluate their travel time to the reflecting object. The technology reliably covers the short range it is designed for, even in poor visibility, and is considered mature, which is why little has been heard of major developments recently. But that impression is deceptive. The technology has been regarded as mature since its introduction in the BMW 7 Series in 1991 and is now standard equipment, yet it still has a great future ahead of it, says the supplier Valeo, which credits itself with that market introduction more than 30 years ago.
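The underlying travel-time principle is straightforward. As a minimal sketch (not any supplier's implementation), the round-trip time of an echo translates into a distance using the speed of sound, here assumed to be about 343 m/s in air at 20 °C:

```python
# Minimal pulse-echo distance estimate for an ultrasonic parking sensor.
# Illustrative only; real systems compensate for temperature, filter noise,
# and fuse the readings of several transducers.

SPEED_OF_SOUND_M_S = 343.0  # assumed speed of sound in air at 20 deg C

def echo_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object from the echo's round-trip time."""
    # The pulse travels to the object and back, hence the division by two.
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# Example: an echo returning after 12 ms corresponds to roughly 2.06 m.
print(f"{echo_distance_m(0.012):.2f} m")
```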

For example, ultrasound is combined with surround cameras for parking. "Our ultrasound systems also provide important information for driving functions such as ACC Stop&Go or Blind Spot Detection," a Valeo spokesperson tells automotiveIT. In 2021, Valeo launched a new system generation with a completely new data interface between the sensors and the control unit. According to Valeo, it opens up a variety of new possibilities for data analysis in the system software, including AI-based algorithms for the first time; this brings significant performance gains as well as new functions and will enable software-based system development for years to come. Valeo's driver assistance systems are manufactured at the Wemding plant, where camera systems, electronic components, and ultrasound, radar, and lidar sensors are produced on 21,400 square meters of production space.

What types of sensors do autonomous cars need?

Ultrasound: Assesses the travel time of emitted sound pulses to the reflecting object. The technology covers a narrowly defined short range.

Radio Detection and Ranging - Radar: This electromagnetic sensor technology evaluates the echo of emitted radio waves.

Light Detection and Ranging - Lidar: A laser-based system whose photo sensors capture the environment solely with the help of light.

Camera: High video resolutions are combined with image processing based on artificial intelligence to distinguish objects.

Radar systems are becoming more powerful

Radar, an electromagnetic sensor technology, scans objects with emitted radio waves and evaluates the returning echo. Radar sensors are well suited to measuring speeds and distances but have long been considered less suitable for classifying objects. Radar inherently offers a good way to determine the distance to a vehicle ahead, for example for automatic emergency braking. For more complex tasks, 77-GHz technology with a radio-frequency bandwidth of one gigahertz has become established; it resolves objects significantly better than 24-GHz radar sensors. For some time, therefore, 77 GHz has been used not only for long-range but also for short-range radars.
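The gain in resolution follows directly from the bandwidth: the achievable range resolution of a radar is roughly c / (2B). The sketch below makes the difference concrete; the 250 MHz figure for older 24-GHz sensors is an assumed typical value, while 1 GHz is the bandwidth named above for 77-GHz radar:

```python
# Range resolution of a radar from its sweep bandwidth: delta_R = c / (2 * B).
# Illustrative comparison; 250 MHz is an assumed typical 24 GHz bandwidth.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest range separation at which two objects can still be distinguished."""
    return SPEED_OF_LIGHT_M_S / (2.0 * bandwidth_hz)

print(f"24 GHz radar, 250 MHz bandwidth: {range_resolution_m(250e6):.2f} m")  # ~0.60 m
print(f"77 GHz radar, 1 GHz bandwidth:   {range_resolution_m(1e9):.2f} m")    # ~0.15 m
```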

New developments could give radar a significant boost among the sensor types. 4D imaging radar can now capture the environment in detail by detecting, separating, and then classifying objects. With this expansion, some experts say, radar has the potential to advance far into the segment of capabilities previously reserved for Lidar. Compared to the laser-based alternative, radar sensors are considered more cost-effective and more robust in poor lighting and weather conditions.

Lidar offers high-resolution near-field detection in real time

Light Detection and Ranging (Lidar) excels at generating high-resolution 3D information in real time. According to Valeo, the corresponding market will grow to over 50 billion US dollars by 2030. Two methods are currently in focus in Lidar technology: travel-time measurement, known as Time of Flight (ToF), and Frequency-Modulated Continuous Wave (FMCW), a method based on frequency modulation.

In time-of-flight (ToF) measurement, laser pulses are emitted, reflected by objects, and picked up again by a detector. The distance of each individual point is derived from the signal's travel time. However, the widely used approach also has limitations, as the provider Blickfeld explains in a blog post on the different Lidar systems: the laser light, which is bundled to maximize range, is subject to strict limits for reasons of eye safety.
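The calculation behind ToF is the same round-trip arithmetic as with ultrasound, only with light instead of sound. A minimal sketch (not a vendor implementation) that turns a handful of measured pulse travel times into point distances might look like this:

```python
# Time-of-flight lidar: distance per point = c * t / 2.
# Illustrative only; real sensors also record the beam angle of each pulse
# to build a 3D point cloud and must respect eye-safety power limits.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distances_m(travel_times_s: list[float]) -> list[float]:
    """Convert measured round-trip pulse times into distances in metres."""
    return [SPEED_OF_LIGHT_M_S * t / 2.0 for t in travel_times_s]

# Example: pulses returning after 100 ns, 400 ns, and 1 microsecond.
print([round(d, 1) for d in tof_distances_m([100e-9, 400e-9, 1e-6])])
# -> [15.0, 60.0, 149.9]
```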

With the FMCW method, by contrast, a continuous laser beam is emitted and the distance to objects is derived from its frequency modulation. "The emitted laser beam is repeatedly modulated and 'chirped', meaning the frequency of the signal is repeatedly changed," explains Blickfeld CEO Mathias Müller. By the time an object reflects the beam back to the sensor, the frequency of the returning light differs from the frequency being emitted at that moment. "Therefore, when received by the detector, the returning light is mixed with the just-emitted light and the frequency difference is measured," Müller explains. The intermediate frequency recorded in this way provides information about distances, and with the help of the Doppler effect, the speed of objects can also be detected.
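Müller's description can be put into numbers: for a linear chirp, the measured beat (intermediate) frequency is proportional to the round-trip delay, and the Doppler shift of the returning light yields the radial speed. The following is a simplified sketch of those two relations with assumed chirp parameters and wavelength; it is not Blickfeld's actual signal processing:

```python
# FMCW lidar, simplified: beat frequency -> range, Doppler shift -> radial speed.
# All parameters below are assumed for illustration (1 GHz chirp over 10 us,
# 1550 nm laser); real processing resolves range and speed jointly.

C = 299_792_458.0          # speed of light in m/s
WAVELENGTH_M = 1550e-9     # assumed laser wavelength
CHIRP_BANDWIDTH_HZ = 1e9   # assumed frequency sweep of one chirp
CHIRP_DURATION_S = 10e-6   # assumed duration of one chirp

def range_from_beat_m(beat_freq_hz: float) -> float:
    """Range from the beat frequency between emitted and returning light."""
    round_trip_delay_s = beat_freq_hz * CHIRP_DURATION_S / CHIRP_BANDWIDTH_HZ
    return C * round_trip_delay_s / 2.0

def speed_from_doppler_m_s(doppler_shift_hz: float) -> float:
    """Radial speed of the target from the Doppler shift of the returning light."""
    return WAVELENGTH_M * doppler_shift_hz / 2.0

print(f"{range_from_beat_m(50e6):.1f} m")          # 50 MHz beat  -> ~75 m
print(f"{speed_from_doppler_m_s(18e6):.1f} m/s")   # 18 MHz shift -> ~14 m/s
```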

But is frequency modulation fundamentally superior to the time-of-flight principle? "You can't say that across the board. Both methods have strengths and weaknesses," Müller cautions. The fact that ToF has been used in lidar sensors for many years, while FMCW is still in its infancy, is a significant difference in terms of maturity. FMCW is complex and currently still very expensive because it places more specific demands on the laser source. Georg List, Vice President Corporate Strategy at AVL, also cannot define a clearly superior system, as sensor technology is evolving very quickly. The engineering expert foresees a coexistence of many sensor systems in the future.

New camera systems offer scene understanding thanks to AI

Cameras today work with very high resolutions, and their image processing can accurately assess and distinguish objects; this can also be trained with the help of artificial intelligence. Limiting factors are contamination of the optics as well as darkness and glare. Bosch emphasizes that an automated vehicle must have all the capabilities a human needs for driving. According to the supplier, this includes perceiving the environment, making decisions, and accelerating, braking, and steering.

With its multifunction camera, Bosch combines classic image processing algorithms with methods of artificial intelligence (AI); thanks to AI, the camera understands and interprets what it sees, according to the supplier. A package of hardware, software, and services is intended to provide highly accurate self-localization. As Bosch reports, the Vehicle Motion and Position Sensor (VMPS) uses satellite navigation signals for precise position determination and supplements them with data from a correction service as well as information from steering-angle and wheel-speed sensors. The cloud-based map service Bosch Road Signature uses data from radar and video sensors as well as vehicle motion data to create additional layers for high-resolution maps.
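As a rough illustration of how satellite positioning can be supplemented with wheel-speed and steering-angle data, the following hypothetical sketch dead-reckons between GNSS fixes using a simple bicycle-model step and then pulls the estimate toward a fresh fix. It is not Bosch's VMPS algorithm, and all names and parameters are invented for the example:

```python
# Hypothetical illustration of supplementing GNSS fixes with odometry.
# Between (possibly delayed or missing) GNSS fixes, the pose is propagated
# with a simple bicycle-model dead-reckoning step. Not Bosch's VMPS algorithm.

import math
from dataclasses import dataclass

@dataclass
class Pose:
    x: float        # metres east
    y: float        # metres north
    heading: float  # radians

WHEELBASE_M = 2.8   # assumed wheelbase

def dead_reckon(pose: Pose, speed_m_s: float, steering_rad: float, dt_s: float) -> Pose:
    """Propagate the pose from wheel speed and steering angle (bicycle model)."""
    yaw_rate = speed_m_s * math.tan(steering_rad) / WHEELBASE_M
    heading = pose.heading + yaw_rate * dt_s
    return Pose(pose.x + speed_m_s * math.cos(heading) * dt_s,
                pose.y + speed_m_s * math.sin(heading) * dt_s,
                heading)

def blend_with_gnss(pose: Pose, gnss_x: float, gnss_y: float, weight: float = 0.3) -> Pose:
    """Pull the dead-reckoned position toward a fresh GNSS fix."""
    return Pose(pose.x + weight * (gnss_x - pose.x),
                pose.y + weight * (gnss_y - pose.y),
                pose.heading)

pose = Pose(0.0, 0.0, 0.0)
pose = dead_reckon(pose, speed_m_s=10.0, steering_rad=0.05, dt_s=0.1)
pose = blend_with_gnss(pose, gnss_x=1.05, gnss_y=0.02)
print(pose)
```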

The combination of different sensors creates a complete picture

Since driving at certain levels of automation cannot be implemented with one technology alone, a combination of different sensor types is required. How important this integration is can also be seen in an initiative from Continental: after a few months of collaboration with Lidar specialist AEye, the technology company announced the integration of AEye's long-range Lidar technology into its full sensor stack solution. According to the supplier's experts, this creates the first automotive-grade full-stack system for automated and autonomous driving applications from Level 2+ to Level 4.

The extensive information from the various sensors must be intelligently bundled, processed, and acted upon by a central vehicle computer within a small installation space. ZF has been working on such a high-performance system for some time: the so-called ProAI, which the company describes as a supercomputer. According to the experts, ProAI is suitable for all levels of automated driving from Level 2 to 5. Currently, the system offers 20 tera operations per second (TOPS), designed for Level 2+, but it can be scaled up to 1,000 TOPS and thus be used from Level 3 upward. With the help of GPU-driven 360-degree fusion, the system is intended to bring together all sensor data in automated vehicles and take over the entire image processing.

The supercomputer family is complemented by the Vehicle Motion Domain Controller (VMD), a specialized control unit. According to ZF, the VMD can control all vehicle movements and caters to the trend toward software-defined vehicles with real-time functions and applications, offering a performance ceiling of 55,000 DMIPS (Dhrystone million instructions per second).

At CES 2022, Blickfeld presented a so-called seamless integration concept that aims to emphasize brand-specific characteristics through an intelligent arrangement of the individual elements. In its concept, the company relies on Vision Mini sensors, which are intended to allow seamless, subtle integration into the vehicle at various mounting positions such as headlights, taillights, and side mirrors. According to Blickfeld, this arrangement enables a complete all-around view and is thus a crucial component on the way to safe assisted and autonomous mobility; the company speaks of an almost invisible embedding of the Lidar sensors.

Driver Monitoring Essential Until Autonomy

The path to full driving autonomy, in which humans are completely relieved of responsibility for their vehicle, is still long. In the stages of (partial) automation, vehicle and driver must be able to hand control back and forth to each other unambiguously. So-called driver monitoring systems are used for this purpose. At present, this mostly means detecting fatigue using cameras. For further progress, however, the systems must be able to interpret the driver's activities in context, beyond mere image recognition.

From 2024, the European Commission will require driver and vehicle monitoring systems in the type-approval requirements of the General Safety Regulation (GSR) for new registrations, for example to detect driver fatigue or inattention. These far-reaching legislative changes are accompanied by another regulatory driver: starting in 2023, the Euro NCAP organization will reward the installation of interior camera systems with points.

The supplier Continental is well aware of this: in mid-October 2021, its Vehicle Networking and Information (VNI) business unit presented an integrated interior-sensing solution. With real-time object detection across the entire vehicle interior, it goes beyond mere driver monitoring and thus offers another building block for future mobility models such as automated or autonomous driving.

This article was first published at automotiveit.eu