
Introduction to self-driving vehicles' sensors: Radars

Self-driving cars require sensors to perceive their surrounding environment. Radar, alongside LIDAR, ultrasonic sensors and video cameras, is among the most commonly found sensing solutions in cars today. This belief in radar is reinforced by Tesla's recent move to use radar, instead of LIDAR, as the main sensing solution in its self-driving cars. The confidence comes from experience: many of the safety systems used in vehicles today are based on radar, the most common examples being collision warning, parking assist and adaptive cruise control.

The versatility of radar is mainly the result of its resistance to a variety of weather conditions, such as heavy rain and snow, in which LIDAR and cameras typically struggle. The goal of this blog post is therefore to introduce the working principles of automotive radars and, through their use cases in the industry, to exemplify their practical usefulness.

In the context of self-driving cars, radars are used to build an internal representation of the world from which driving decisions can be made. This representation usually takes the form of a map of the surrounding environment that accounts for all objects in it, so that changes in them can be observed. The result is elevated safety for both the passengers of the self-driving car and the other members of traffic. The safety systems mentioned previously have found use in self-driving cars for the same reasons they have in regular cars.

Most widely used sensors in vehicles. Source: Pinterest

Before shedding light on the use cases of radars, an overview of the technology itself should be given. Radars, like LIDAR and video cameras, use the propagation of electromagnetic (EM) waves to sense the environment. Visible light and radio waves are the most common examples of EM waves used across industries; the only difference between them is their wavelength.

But how does radar use EM waves to gather information about its surrounding environment? Simply noting that radar waves have a different wavelength than the visible light used by conventional cameras and the human eye says little about how radar depicts the world. In reality, it is easy to visualize once the principles of the technology are known. Radar, similarly to LIDAR, measures the range and angle of arrival of objects in its field of view. This is done using the time of flight and the propagation speed of the transmitted EM wave: the wave is reflected by objects in its path and propagates back to the sensor for analysis.

Wave propagation illustrated on the example of a sonar cone. The transmitted wave (red) travels from the source (blue) to the object (green) and back (purple). Source: physics-and-radio-electronics.com

The reflected wave is then used to determine the range, angle of arrival and speed of the object. The range is obtained by measuring the time of flight of the wave, while the velocity of the object is derived from the Doppler shift that the object's motion induces in the wave.

Range measurement principle for time-of-flight sensors. Source: Voyage
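To make these two principles concrete, here is a minimal sketch in Python. The formulas (range from the round-trip time, radial velocity from the Doppler shift) are standard; the 77 GHz carrier frequency and the sample numbers are illustrative assumptions, not values from any particular sensor.

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_time_of_flight(round_trip_time_s: float) -> float:
    """The wave travels to the object and back, hence the division by 2."""
    return C * round_trip_time_s / 2

def velocity_from_doppler(doppler_shift_hz: float, carrier_freq_hz: float) -> float:
    """Radial velocity from the Doppler shift of the reflected wave.

    The factor of 2 again accounts for the two-way trip: the moving
    object shifts the wave once on reception and once on re-emission.
    """
    wavelength = C / carrier_freq_hz
    return doppler_shift_hz * wavelength / 2

# Illustrative numbers for an assumed 77 GHz automotive radar:
print(range_from_time_of_flight(0.33e-6))   # echo after 0.33 us -> ~49.5 m
print(velocity_from_doppler(5_000, 77e9))   # 5 kHz shift -> ~9.7 m/s
```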

In addition to the range and angle of arrival of the object, the relative velocity of the target is determined. From the range, angle of arrival and velocity measurements, a two- or three-dimensional point cloud is composed; its dimensionality depends on the capability of the radar. The range and angle of arrival specify the spatial placement of the reflection points, while the velocity signifies the rate of change of that position.

Point cloud of a low-resolution radar.
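As a rough illustration of how such a 2D point cloud is composed, the sketch below converts (range, azimuth, radial velocity) detections into Cartesian points. The tuple layout and the convention that azimuth is measured from the sensor's boresight are assumptions; real sensors differ.

```python
import math

def to_cartesian(detections):
    """Turn (range m, azimuth rad, radial velocity m/s) detections into
    2D points, keeping the velocity attached to each reflection point."""
    cloud = []
    for rng, azimuth, radial_v in detections:
        x = rng * math.cos(azimuth)  # forward, along the boresight
        y = rng * math.sin(azimuth)  # lateral
        cloud.append((x, y, radial_v))
    return cloud

print(to_cartesian([(12.0, 0.10, -3.2), (45.5, -0.25, 0.0)]))
```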

To explain why one would use radar over LIDAR and video cameras, a more detailed comparison of its benefits and drawbacks should be given. The primary use case of radar follows directly from its ability to measure the location of an object in space: object tracking. Using range and relative velocity, various safety systems can be implemented, making travel safer for all parties involved. These safety systems offload some of the mental load of driving, leaving more room to focus on its more important aspects. On the other hand, this reduced mental stress can lead to carelessness behind the wheel and, in turn, to accidents. That aside, radar has two main advantages over LIDAR and video cameras:

  • Resilience to difficult weather – as mentioned previously, radar is capable of operating in fog, smoke and heavy rain and snowfall. It is even capable of tracking objects behind wooden and brick walls.
  • Accuracy of sensing – radar measures the location of an object to within a few centimeters at ranges of a few dozen meters. When the range increases to a few hundred meters, the accuracy is in the ballpark of forty centimeters. Compared to video cameras, whose range is likewise about a few hundred meters but whose accuracy falls off far more steeply with distance, this is quite good.

Compared to other sensors, radar does have its own limitations and drawbacks. Removing stationary reflection points from the radar's field of view leads to the introduction of invisible objects. There are two types of invisible objects: those that move at the same speed as the vehicle and those that move straight at the vehicle, as stationary objects in its path appear to do relative to the radar. The first case is easily rectifiable, but increases clutter in the sensing area. The second one is more difficult. These reflection points are filtered for a reason: consider a traffic sign mounted above a lane on the highway. As radars are used in adaptive cruise control, braking is done based on the radar measurements. With such traffic signs as an input, the vehicle could cause a traffic accident by braking unexpectedly. The same holds for stationary traffic signs on the ground at the edge of the road. This scenario of invisible objects causing traffic accidents is a common one for autonomous vehicles.
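A hypothetical sketch of the filtering step that creates these invisible objects is given below. It assumes a forward-facing radar reporting range rate (positive when the reflection point is receding); with that convention a stationary object at azimuth `az` produces a range rate of `-ego_speed * cos(az)`, and everything close to that value is discarded as clutter, overhead signs included.

```python
import math

def drop_stationary(detections, ego_speed_ms, tolerance_ms=0.5):
    """Remove detections whose Doppler matches a stationary reflector.

    detections: iterable of (range m, azimuth rad, range rate m/s),
    where a positive range rate means the point is moving away.
    A stationary object seen by a forward-moving radar closes in at
    ego_speed * cos(azimuth), i.e. it has range rate -ego_speed * cos(az).
    """
    moving = []
    for rng, az, range_rate in detections:
        stationary_rate = -ego_speed_ms * math.cos(az)
        if abs(range_rate - stationary_rate) > tolerance_ms:
            moving.append((rng, az, range_rate))
    return moving

# At 20 m/s: an overhead sign dead ahead (range rate -20) is dropped,
# while an oncoming car (range rate -50) is kept.
print(drop_stationary([(80.0, 0.0, -20.0), (60.0, 0.0, -50.0)], 20.0))
```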

The other major drawback is noise in the radar measurements across the sensor's field of view, caused by the properties of EM wave propagation. Radar waves are not guaranteed to travel along the line of sight: they can reflect multiple times, and these multi-path reflections manifest as secondary detections at the wrong range. Since non-line-of-sight reflections arrive with lower signal strength than line-of-sight ones, there is room for algorithmic filtering based on signal strength.

Such filtering based on signal strength is, however, guesswork: finding the threshold that separates valid reflections from invalid ones is possible only when the properties of the environment and the materials of its objects are known. This assumption is not feasible in the context of self-driving vehicles, as creating this representation of the environment is the very goal.
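For completeness, a minimal sketch of such a strength-based filter is shown below. The `power_db` field and the -75 dB threshold are invented for illustration, and the whole point of the paragraph above is that no single threshold is right for every environment.

```python
def keep_strong(detections, threshold_db):
    """Keep reflections above a signal-strength threshold, on the
    assumption that multi-path ghosts arrive weaker than direct echoes."""
    return [d for d in detections if d["power_db"] >= threshold_db]

detections = [
    {"range_m": 30.0, "power_db": -60.0},  # plausible line-of-sight echo
    {"range_m": 61.5, "power_db": -85.0},  # plausible multi-path ghost
]
print(keep_strong(detections, threshold_db=-75.0))
```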

Point cloud of high-resolution radar measurements. Source: OCULII

Now that the principles behind radar technology have been covered, its real use cases can be introduced. Since the point clouds resulting from the measurements are insufficient on their own to create context for autonomous driving, algorithmic processing is required. As can be seen from the previous picture, it is hard to create an association between an object and its point cloud. The human eye, however, can derive the traffic situation and the general environment by just glancing at the picture. This kind of processing comes easily to humans, but requires much computational power from computers. Data association is easier for LIDAR and video cameras, since their information density is often far higher; radar, in contrast, offers reasonable information density while providing far better resilience to weather. A rough sketch of such point cloud clustering is given after the list below. To exemplify the above, here is a brief list of current and future areas where radar is applicable:

  • Automotive industry
    • Included in safety systems as:
      • Collision warning and avoidance
      • Intelligent and adaptive cruise control
      • Parking-assist
      • Blind spot detection
  • Manufacturing industry
    • Safety systems on conveyors
    • Localization in predefined spaces
    • Agricultural systems for slippage and tilt detection
  • Robotics
    • Safe spaces
    • Robot pose estimation
    • Ground speed for slippage detection
  • Home automation
    • Light and door control
    • Space occupation detection

Examples of radar use cases. Source: Texas Instruments
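As promised above, here is a rough sketch of one common way to associate radar points with objects: density-based clustering of the Cartesian cloud. The use of scikit-learn's DBSCAN and the `eps`/`min_samples` values are assumptions for illustration; production systems tune these per sensor and usually cluster in more dimensions (e.g. including velocity).

```python
import numpy as np
from sklearn.cluster import DBSCAN

# A toy Cartesian cloud: two dense blobs (two objects) plus one stray point.
cloud = np.array([
    [12.1, 1.3], [12.3, 1.1], [12.0, 1.5],   # object candidate 1
    [45.0, -9.8], [45.2, -9.5],              # object candidate 2
    [80.0, 30.0],                            # isolated point -> noise
])

labels = DBSCAN(eps=1.0, min_samples=2).fit_predict(cloud)
print(labels)  # [0 0 0 1 1 -1]; -1 marks points left unassociated
```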

To conclude, potential future use cases are listed. In addition to the previously mentioned ones, radars could potentially be used as follows:

  • Emergencies
    • Emergency vehicle navigation in smoke, fog and so on
    • Emergency workers for navigation in case of building fires
  • Self-driving cars
    • Mapping the environment
    • Detecting change in the environment
  • Robotics
    • Similarly to self-driving cars, autonomous robots could use them for:
      • Mapping
      • Localization
      • SLAM
      • Agricultural safety systems
  • Building automation
    • Zone occupancy:
      • Light and door control
      • Resource management optimization

The Ministry of Education and Research and the Estonian Research Council supported the completion of this blog post.