Spoofing LiDAR could blind autonomous vehicles to obstacles

Humans manage to drive acceptably using only two eyes and two ears to sense the world around them. Autonomous vehicles are equipped with sensor suites that are altogether more complex. They typically rely on radar, lidar, ultrasonic sensors, and cameras, all working in concert, to detect upcoming road conditions.

While humans are quite cunning and difficult to deceive, our robot friends are less hardy. Some researchers worry that LiDAR sensors could be spoofed, obscuring obstacles and causing collisions with driverless cars, or worse.

Using a laser to send false echoes back to a LiDAR sensor on an autonomous vehicle can be used to conceal objects in its field of view Credit: research paper, Cao, Yulong and Bhupathiraju, S. Hrushikesh and Naghavi, Pirouz and Sugawara, Takeshi and Mao, Z. Morley and Rampazzi, Sara

LiDAR is so named because it is the light-based equivalent of radar. Unlike radar, however, it is still generally treated as an acronym rather than a word in its own right. The technology sends out laser pulses and captures the light reflected back by the environment. Pulses returning from more distant objects take longer to come back to the LiDAR sensor, allowing the sensor to determine the range of objects around it. It is generally considered the benchmark sensor for autonomous driving, thanks to its greater accuracy and reliability compared to radar for object detection in automotive environments. In addition, it delivers detailed depth data that is simply not available from an ordinary 2D camera.
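The time-of-flight principle behind that ranging is simple enough to sketch in a few lines. This is a back-of-the-envelope illustration, not code from the paper: the pulse travels to the object and back, so the one-way range is half the round trip.

```python
# Time-of-flight ranging: a pulse travels to the object and back,
# so range = (speed of light * round-trip delay) / 2.
C = 299_792_458  # speed of light in a vacuum, m/s

def range_from_delay(delay_s: float) -> float:
    """Return the one-way range in metres for a round-trip echo delay in seconds."""
    return C * delay_s / 2

# An echo arriving roughly 667 ns after the pulse was fired
# corresponds to an object about 100 m away.
print(range_from_delay(667e-9))  # ~100 (metres)
```

Nanosecond-scale timing is what makes centimetre-scale range resolution possible.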

A new research paper has demonstrated a counterintuitive method for fooling LiDAR sensors. The method uses a laser to selectively mask certain objects so that they cannot be "seen" by the LiDAR sensor. The paper calls this a "physical removal attack," or PRA.

The attack exploits how LiDAR sensors work. Typically, these sensors prioritize stronger reflections over weaker ones. This means that a strong spoofed signal sent by an attacker will take priority over a weaker genuine reflection from the environment. LiDAR sensors, and the autonomous driving frameworks built on top of them, also typically reject detections below a certain minimum distance from the sensor, usually on the order of 50 mm to 1000 mm.
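Those two behaviours can be sketched together. This is a simplified, assumed model of the sensor logic (the cut-off value and tuple format are illustrative, not taken from any specific device): for each laser firing, keep only the strongest echo, then discard it if it falls inside the minimum-range cut-off.

```python
# Simplified model of per-firing return resolution:
# 1. strongest echo wins; 2. returns closer than the cut-off are dropped.
MIN_RANGE_MM = 500  # illustrative cut-off; real sensors use ~50-1000 mm

def resolve_return(echoes, min_range_mm=MIN_RANGE_MM):
    """echoes: list of (range_mm, intensity) tuples for one firing.
    Returns the surviving (range_mm, intensity) point, or None."""
    if not echoes:
        return None
    strongest = max(echoes, key=lambda e: e[1])
    return strongest if strongest[0] >= min_range_mm else None

# A genuine echo from an obstacle 12 m away survives...
print(resolve_return([(12_000, 40)]))             # (12000, 40)
# ...but a brighter spoofed echo at 30 cm wins priority,
# then falls inside the cut-off and is rejected: no point at all.
print(resolve_return([(12_000, 40), (300, 90)]))  # None
```

The second call shows the crux of the attack: the genuine echo is not merely overshadowed, it is replaced by a point the sensor then throws away.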

The attack works by firing infrared laser pulses that mimic the real echoes the LiDAR device expects to receive. The pulses are timed to match the firing cycle of the victim LiDAR sensor, which lets the attacker control the perceived location of the spoofed points. Because the spoofed pulses are brighter than genuine reflections, the sensor ignores the weaker true echoes from an object in its field of view. On its own, this would not hide the obstacle so much as replace it with a spoofed object very close to the sensor. However, since many LiDAR sensors reject excessively close returns, the sensor will likely discard the spoofed points entirely. Even if the sensor does not remove the data, the filtering software running on its point cloud output may do so itself. The net effect is that the LiDAR reports no valid point cloud data at all in an area where it should be detecting an obstacle.
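Putting the pieces together, the removal effect can be illustrated over a whole scan. This is an illustrative sketch of the outcome described above, not the paper's code; the scan format, angles, and intensity values are made up for the example. The attacker injects a strong, very-near spoofed echo only at the angles covering the obstacle, and those angles end up empty in the output.

```python
# Illustrative "physical removal" sketch: spoofed near-range echoes
# win the strongest-return contest, then get dropped by the
# minimum-range filter, leaving a hole in the point cloud.
MIN_RANGE_MM = 500  # illustrative near-range cut-off

def point_cloud(scan, spoofed_angles):
    """scan: {angle_deg: (range_mm, intensity)} of genuine echoes.
    Returns {angle_deg: range_mm} after return selection and filtering."""
    cloud = {}
    for angle, (rng, intensity) in scan.items():
        echoes = [(rng, intensity)]
        if angle in spoofed_angles:
            echoes.append((100, 255))  # spoofed echo: 10 cm, maximum intensity
        best_rng, _ = max(echoes, key=lambda e: e[1])
        if best_rng >= MIN_RANGE_MM:
            cloud[angle] = best_rng
    return cloud

# Genuine obstacle at 8 m across a 0-80 degree scan.
scan = {angle: (8_000, 50) for angle in range(0, 90, 10)}
print(point_cloud(scan, spoofed_angles=set()))         # all nine angles present
print(point_cloud(scan, spoofed_angles={30, 40, 50}))  # 30-50 degrees vanish
```

The attacked cloud contains no points at the spoofed angles, which is exactly the blind spot the researchers warn about: downstream perception sees empty space, not a suspicious near-field object.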

The attack requires some know-how, but is surprisingly practical to perform. It only takes a little research into the different types of LiDAR used on autonomous vehicles to build a suitable spoofing device. The to...
