Self-Driving Cars Confused by Flashing Emergency Lights, Study Finds
Researchers explain how the confusion became so severe that the technology could no longer accurately identify objects on the road.
A fascinating new study has revealed how some camera-based automated driving systems can become confused by the flashing lights of emergency vehicles.
The paper, from researchers at Ben-Gurion University of the Negev in Beersheba, Israel, and Japanese tech firm Fujitsu, detailed how the confusion became so severe that the tech could no longer confidently identify objects on the road.
And they have even come up with a name for the process – EpileptiCar, which they describe as a “digital epileptic seizure phenomenon that causes an object detector’s confidence score to fluctuate when exposed to an activated emergency vehicle flasher.”
As the paper's abstract stated: “This vulnerability poses a significant risk, because not only does it cause autonomous vehicles to crash near emergency vehicles, but it can also be exploited by adversaries to cause such accidents.”
The research was prompted by National Highway Traffic Safety Administration investigations into 16 incidents in which Teslas fitted with driver assistance tech crashed into first responder vehicles stopped for emergencies on or beside the road. Those crashes caused 15 injuries and one death.
No Teslas were involved in the researchers’ testing; instead they used five commercially available automated driving systems from HP, Pelsee, AZDOME, Imagebon and Rexing, along with four object detectors: YOLO, SSD, RetinaNet and Faster R-CNN.
The researchers explained: “An object detector is a computer vision model designed to identify and locate specific objects within an image or video by drawing bounding boxes around them and providing a classification of their types.”
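For readers who want to see what that means in practice, here is a minimal sketch of running one of the detectors the study tested, Faster R-CNN, via the torchvision library. The input image path and score threshold are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: run a pre-trained Faster R-CNN object detector on one frame.
# Faster R-CNN is one of the four detectors the study evaluated; the image
# path and confidence threshold below are illustrative, not from the paper.
import torch
from torchvision.io import read_image
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()

frame = read_image("dashcam_frame.jpg")          # hypothetical input frame
with torch.no_grad():
    detections = model([preprocess(frame)])[0]   # boxes, labels, scores

categories = weights.meta["categories"]
for box, label, score in zip(detections["boxes"],
                             detections["labels"],
                             detections["scores"]):
    if score > 0.5:                              # illustrative threshold
        print(f"{categories[label]}: {score:.2f} at {box.tolist()}")
```

Each printed line is a bounding box, a class label and a confidence score, which is exactly the output the flashing lights were found to destabilise.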
These were assessed against 14 different emergency-light flashing patterns, and by the end of the testing the evidence was clear: intense flashing lights caused EpileptiCar, whereby the light “alters the tonal distribution of the vehicle’s colors in the camera frame, causing the object detector to misinterpret the visual data.”
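The fluctuation the researchers describe can be pictured by running the same kind of detector over successive video frames and tracking its top confidence score for the vehicle. The clip name and the "car" class filter below are illustrative assumptions; this is not the paper's evaluation code.

```python
# Illustrative sketch (not the paper's code): track how the detector's top
# "car" confidence changes frame-by-frame in a clip with flashing lights.
import cv2          # assumed available for video decoding
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
model = fasterrcnn_resnet50_fpn(weights=weights).eval()
preprocess = weights.transforms()
car_id = weights.meta["categories"].index("car")

cap = cv2.VideoCapture("emergency_flasher_clip.mp4")    # hypothetical clip
scores_per_frame = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    tensor = torch.from_numpy(rgb).permute(2, 0, 1)      # HWC -> CHW
    with torch.no_grad():
        det = model([preprocess(tensor)])[0]
    car_scores = det["scores"][det["labels"] == car_id]
    scores_per_frame.append(car_scores.max().item() if len(car_scores) else 0.0)

# A stable detector would produce a roughly flat curve; under an active
# flasher the study reports this confidence oscillating with the light.
print(scores_per_frame)
```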
Several factors also had a major influence on the extent of the problem: it was more pronounced in low-light conditions, different flash patterns had different effects, and the effect grew stronger the closer the camera was to the flashing light.
Handily, the researchers have come up with a solution to the problem, one with an equally snappy handle: Caracetomol. It’s essentially a software framework designed to train object detectors to handle flashing lights more reliably.
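The write-up doesn't go into how the framework works, but one common way to harden a detector against a visual disturbance is to train it on frames where that disturbance has been simulated. The augmentation below is a hypothetical illustration of that general idea, not the researchers' Caracetomol method.

```python
# Hypothetical sketch of flasher-style training augmentation; this is an
# illustration of the general robustness idea, not the Caracetomol framework.
import random
import torch

def add_simulated_flasher(image: torch.Tensor,
                          strength: float = 0.6) -> torch.Tensor:
    """Overlay a red or blue glare patch on a 3xHxW float image in [0, 1],
    crudely mimicking how an activated emergency flasher shifts the tonal
    distribution of a camera frame."""
    _, h, w = image.shape
    out = image.clone()
    ph, pw = h // 4, w // 4                         # size of the glare patch
    top, left = random.randint(0, h - ph), random.randint(0, w - pw)
    tint = torch.zeros(3, 1, 1)
    tint[random.choice([0, 2])] = 1.0               # red (0) or blue (2)
    out[:, top:top + ph, left:left + pw] = (
        (1 - strength) * out[:, top:top + ph, left:left + pw]
        + strength * tint
    )
    return out.clamp(0.0, 1.0)

def augment_batch(frames, p: float = 0.5):
    """Apply the simulated flasher to a random subset of training frames so
    the detector also sees vehicles under flashing-light conditions."""
    return [add_simulated_flasher(f) if random.random() < p else f
            for f in frames]
```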
What the research did not establish was whether EpileptiCar was behind the Tesla crashes. The researchers said: “We do not know. Tesla vehicles fuse data obtained by additional sensors (e.g. radars and ultrasonic sensors) beyond the video cameras. Our work only focused on video cameras.”
They also pointed out that the tech they tested may not be fitted to Tesla cars.
However, they hope the wider industry will take their work into account during development and testing. The paper concluded: “We hope that this will encourage the automotive industry to validate our findings on their ADASs and semi-autonomous cars with their object detectors.”