A basic requirement for sophisticated advanced driver assistance systems and automated driving is the ability to reliably perceive the vehicle’s surroundings and evaluate them accurately on the fly. The international technology company Continental is working on the next generation of an environment model that will deliver a seamless, true-to-life, 360-degree view of the entire vehicle surroundings. For automated vehicles to assume control from drivers, the vehicle must continuously acquire, process and interpret data while building up contextual knowledge. This is the only way to achieve sophisticated levels of automated driving that can master anything from straightforward freeway driving to the highly complex urban environment. A reliable environment model requires a range of information, for example about other traffic participants, static objects, the vehicle’s own precise location and traffic control measures.
“In order for the system to acquire this information step-by-step, a range of sensors such as radars, cameras and Surround View systems are needed. The aim is to achieve an understanding of the vehicle’s surroundings that is as good as or better than a person’s own understanding. More range, more sensors and the combination of acquired data with powerful computer systems will help to sharpen the view and are the key to achieving a consistent view of our surroundings,” said Karl Haupt, Head of Continental’s Advanced Driver Assistance Systems business unit.
Since each of the different surroundings sensors – whether radar, camera or Surround View system – has its own physical strengths and weaknesses, some applications can push them to the limits of their capabilities. In addition to supplementary information, for example from a backend, further sensors are needed to enhance reliability and robustness and to provide additional redundancy.
“This is why we are working on a High Resolution 3D Flash LIDAR, which is ideal for fulfilling the strict requirements regarding vehicle surroundings monitoring. The sensor captures and processes real-time 3D machine vision and does not contain any mechanical components,” Haupt said.
The data from the various sensors can be processed either in the individual sensors or on a central control unit, constructing a high-precision environment model of the vehicle’s surroundings. The greater the data volume to be processed and analyzed, the more computing power is needed. This, in turn, drives the need for control units more powerful than those in use today to construct and manage the environment model. The environment model represents an intermediate software layer between the individual sensors and the different applications. This layer contains data fusion and planning algorithms designed to enhance accuracy and reliability, to expand the field of vision beyond that of the individual sensors, and to act as an abstraction layer with respect to the different functions. Acting as the central point for evaluating and interpreting all the acquired information is Continental’s Assisted & Automated Driving Control Unit, which generates the environment model more than 50 times per second.
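The fusion step described above can be illustrated with a minimal sketch. This is not Continental’s actual algorithm or API; all names, the detection format (x, y, confidence), and the 1-meter gating threshold are illustrative assumptions used only to show the idea of merging overlapping detections from several sensors into one object list.

```python
def fuse(detections_per_sensor):
    """Merge overlapping detections from several sensors into fused objects
    using confidence-weighted averaging of their positions.

    Each detection is a hypothetical (x, y, confidence) tuple in meters;
    detections within 1 m of an existing fused object are treated as the
    same physical object (a deliberately crude gating rule).
    """
    fused = []
    for detections in detections_per_sensor:
        for x, y, conf in detections:
            for obj in fused:
                if abs(obj["x"] - x) < 1.0 and abs(obj["y"] - y) < 1.0:
                    # Same object seen by another sensor: average positions,
                    # weighted by each sensor's confidence.
                    w = obj["conf"] + conf
                    obj["x"] = (obj["x"] * obj["conf"] + x * conf) / w
                    obj["y"] = (obj["y"] * obj["conf"] + y * conf) / w
                    obj["conf"] = w
                    break
            else:
                # No match: this is a newly observed object.
                fused.append({"x": x, "y": y, "conf": conf})
    return fused

# Example: a radar and a camera both detect roughly the same pedestrian.
radar = [(10.2, 3.1, 0.6)]
camera = [(10.0, 3.0, 0.8)]
model = fuse([radar, camera])
print(len(model))  # the two detections are merged into one fused object
```

In a real environment model this loop would run at the article’s stated rate of more than 50 updates per second, with far more sophisticated association and tracking (e.g. Kalman filtering) in place of the simple distance gate.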
From space applications to the road: High Resolution 3D Flash LIDAR
The High Resolution 3D Flash LIDAR is a core component for generating a comprehensive 3D environment model.
“The technology, which has already been deployed for space operations, provides a significantly more comprehensive and detailed 3D view of the entire vehicle surroundings – both during the day and at night – and works reliably even in adverse weather conditions,” said Arnaud Lagandré, Head of Continental’s High Resolution 3D Flash LIDAR segment.
The 3D Flash LIDAR expands Continental’s portfolio of surroundings sensors for advanced driver assistance systems to support highly and fully automated driving in combination with other sensors. Series production is scheduled to start in 2020.
Compared with many scanner components used today, the High Resolution 3D Flash LIDAR comprises just two key components: a laser as a transmission source – like the flash of a camera – that illuminates the vehicle surroundings at distances beyond 200 meters, and a highly integrated receptor chip. This chip is similar in concept to the sensor chip in a digital camera, except that each pixel records both the reflected light and the laser pulse’s transit time, which corresponds to the distance to the object. This simple yet highly efficient method allows highly accurate, distortion-free images of the surroundings to be generated with every laser flash.
“In this way, a complete 3D model of the vehicle surroundings – from as close as a few centimeters to more than 200 meters away – is constructed in just 1.32 microseconds, 30 times per second. In addition, the distance to each individual object is accurately measured,” said Lagandré. “The low complexity and high industrial feasibility mean that we can efficiently install multiple sensors all around the vehicle, thereby enabling us to generate complete, real-time, 360-degree images of the vehicle surroundings.”
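The quoted figures follow directly from the speed of light: each pixel’s range is the round-trip transit time of the laser pulse multiplied by the speed of light, divided by two, and the round-trip time for a 200-meter range comes out at roughly 1.33 microseconds, consistent with the 1.32-microsecond capture time cited above. A short sketch of the arithmetic (the function name is illustrative, not from the article):

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_transit_time(t_seconds):
    """Distance to a reflecting object from the laser pulse's round-trip
    transit time: the pulse travels out to the object and back, so the
    one-way distance is half the total path."""
    return C * t_seconds / 2.0

# Round-trip transit time for the ~200 m range cited in the article:
t_200m = 2 * 200.0 / C
print(round(t_200m * 1e6, 2))  # in microseconds; close to the quoted 1.32 us

# Inverse direction: a pixel recording a 1.0 microsecond transit time
# corresponds to an object at roughly 150 m.
print(round(range_from_transit_time(1.0e-6), 1))
```

This per-pixel computation is what lets a single flash yield a full depth image rather than a point-by-point scan.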
Central computer with strictest safety requirements for automated driving
The Assisted & Automated Driving Control Unit is used for evaluating and interpreting the acquired data and ultimately constructing a comprehensive surroundings model. It is a central control unit comprising a network of multiple heterogeneous processing units. The control unit also plays a key role in interconnecting electronic chassis and safety systems. This increases the functional scope by interconnecting systems that used to operate in isolation. Since intervention decisions are centrally coordinated, simultaneous interventions in different systems harmonize perfectly with each other.
“For Continental, the Assisted & Automated Driving Control Unit is a central element for implementing the required functional safety architecture and, at the same time, a host for the central environment descriptions and driving functions needed for automated driving,” said Michael Zydek, Head of the Assisted & Automated Driving Control Unit product group in the Advanced Driver Assistance Systems business unit.
The goal is to offer a scalable product family for assisted and automated driving that meets the most stringent of safety requirements (ASIL D) by 2019. The control unit will be equipped with various connections for Ethernet and Low Voltage Differential Signaling (LVDS) to manage the required flow of data.
“During development, we distinguish between an Assisted Driving Control Unit and an Automated Driving Control Unit. The first is a scalable control unit module for advanced driver assistance systems that, for each equipment level, offers a complete, cost-optimized package comprising sensors and a control unit. The control unit for automated driving is a powerful computer that meets the requirements of highly automated driving and focuses on specific digital structures for the environment model, computer for ASIL D, and real-time performance,” said Zydek.