Self-driving cars can be forced to brake by hijacked billboards

Researchers demonstrate how “phantom objects” can fool autonomous vehicles and prompt particular actions.
Written by Charlie Osborne, Contributing Writer

Security researchers have demonstrated how hijacked billboards could be used to confuse self-driving cars -- forcing them to slam on the brakes, or worse. 

Autonomous driving systems have come on in leaps and bounds in recent years, but not without mistakes, confusion, and accidents along the way. 

Vehicle intelligence has a long way to go before it could be considered fully autonomous and safe to use without the supervision of a human driver. As technology firms continue to refine their platforms, the focus tends to be on weather conditions, mapping, and how cars should respond to hazards on the road -- such as pedestrians or other vehicles.


However, as reported by Wired, there may be other hazards that humans cannot detect with the naked eye. 

New research conducted by academics from Israel's Ben-Gurion University of the Negev suggests that so-called "phantom" images -- such as a stop sign created from flickering lights on an electronic billboard -- could confuse AI systems and prompt particular actions or movements.  

This could not only cause traffic jams but also more serious road accidents, with hackers leaving little evidence of their activities -- and leaving drivers perplexed over why their smart vehicle suddenly changed its behavior. 


Light projections spanning only a few frames and displayed on an electronic billboard could cause cars to "brake or swerve," security researcher Yisroel Mirsky told the publication, adding, "so somebody's car will just react, and they won't understand why."

Tests were performed on a vehicle running Tesla's latest version of Autopilot and on a Mobileye system. According to Wired, a phantom stop sign appearing for 0.42 seconds fooled the Tesla, whereas just one-eighth of a second was enough to dupe the Mobileye system. 
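To put those detection thresholds in terms of "frames," the figures above can be converted into the number of display frames a phantom image would occupy at a given billboard refresh rate. This is a rough illustrative sketch: the refresh rates used here are assumptions for the sake of the arithmetic, not figures from the research.

```python
import math

# Detection thresholds reported by Wired: 0.42 s fooled the Tesla
# (Autopilot) test, 0.125 s (one-eighth of a second) fooled Mobileye.
THRESHOLDS_S = {"Tesla Autopilot": 0.42, "Mobileye": 0.125}

def frames_needed(duration_s: float, refresh_hz: float) -> int:
    """Smallest whole number of frames spanning at least duration_s."""
    return math.ceil(duration_s * refresh_hz)

# Illustrative refresh rates only -- actual billboard hardware varies.
for system, duration in THRESHOLDS_S.items():
    for hz in (30, 60):
        print(f"{system}: {frames_needed(duration, hz)} frames at {hz} Hz")
```

At an assumed 60 Hz display, the 0.125-second threshold works out to only a handful of frames -- consistent with the researchers' description of projections "spanning only a few frames."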


The experiments build on previous research that used split-second light projections -- such as the shape of a human being -- to confuse autonomous vehicles on the road. While those tests had the same effect, a digital billboard would, in theory, be more convenient for attackers seeking disruption on a wider scale.

The research is due to be presented in November at the ACM Conference on Computer and Communications Security (CCS).


