Beware: Autonomous vehicles at risk of road object deception.

March 1, 2024

TLDR:

  • Researchers at UCI and Keio University have found vulnerabilities in LiDAR technology used in autonomous vehicles.
  • They were able to trick LiDAR sensors into perceiving fake objects or missing real ones, potentially leading to unsafe driving behavior.

The team's research focused on spoofing attacks against commercially available LiDAR systems used in autonomous vehicles, and it found that both first-generation and newer-generation systems exhibit safety deficiencies. Through a combination of real-world testing and computer modeling, the researchers identified 15 new findings that could inform the design and manufacture of future autonomous vehicle systems.

First-generation LiDAR systems were vulnerable to a "fake object injection" attack, in which sensors are tricked into perceiving nonexistent objects in the roadway. Newer-generation systems employ countermeasures such as timing randomization and pulse fingerprinting to prevent this type of attack, but the researchers were still able to confuse even these systems by concealing existing vehicles from their sensors.
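To make the "fake object injection" idea concrete, here is a minimal, purely illustrative sketch: it injects a dense cluster of spoofed returns into a simplified LiDAR point cloud and shows a naive obstacle check reporting a phantom vehicle directly ahead. The point-cloud representation, function names, and thresholds are hypothetical assumptions for illustration only; they are not taken from the researchers' attack method or from any specific LiDAR product.

```python
# Illustrative sketch of a "fake object injection" spoofing attack on a
# simplified LiDAR point cloud. All names, thresholds, and the detection
# logic are hypothetical and not taken from the UCI/Keio study.
import numpy as np


def make_scene(n_points: int = 2000, seed: int = 0) -> np.ndarray:
    """Generate a benign point cloud: scattered returns 20-60 m ahead."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(20.0, 60.0, n_points)  # forward distance (m)
    y = rng.uniform(-8.0, 8.0, n_points)   # lateral offset (m)
    z = rng.uniform(0.0, 2.0, n_points)    # height (m)
    return np.column_stack([x, y, z])


def inject_fake_object(cloud: np.ndarray, distance: float = 8.0) -> np.ndarray:
    """Append a tight cluster of spoofed returns that mimics a vehicle
    sitting in the sensor's lane at the given distance."""
    rng = np.random.default_rng(1)
    fake = np.column_stack([
        rng.normal(distance, 0.2, 300),  # tight cluster in forward distance
        rng.normal(0.0, 0.8, 300),       # roughly lane-width lateral spread
        rng.uniform(0.3, 1.5, 300),      # vehicle-height returns
    ])
    return np.vstack([cloud, fake])


def naive_obstacle_check(cloud: np.ndarray, brake_distance: float = 10.0,
                         min_points: int = 50) -> bool:
    """Naive planner logic: brake if enough points fall inside the lane
    corridor closer than brake_distance."""
    in_lane = np.abs(cloud[:, 1]) < 1.5
    too_close = cloud[:, 0] < brake_distance
    return int(np.count_nonzero(in_lane & too_close)) >= min_points


benign = make_scene()
spoofed = inject_fake_object(benign)
print("benign scene triggers braking: ", naive_obstacle_check(benign))   # False
print("spoofed scene triggers braking:", naive_obstacle_check(spoofed))  # True
```

The sketch shows why a phantom cluster of returns is enough to trigger a reaction like emergency braking in a system that trusts raw sensor data; the timing-randomization and pulse-fingerprinting countermeasures mentioned above aim to reject returns that do not match the sensor's own emitted pulses.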

The potential consequences of these vulnerabilities include triggering unsafe driving behaviors in autonomous vehicles, such as sudden emergency braking or front collisions. The study underscores the need for robust cybersecurity measures in autonomous vehicle technologies to ensure the safety of passengers and pedestrians on the road.