Self-Driving Cars May Be Biased Against Skin Color, Children
Research found the disparities after assessing AI-based pedestrian detectors used in developing driverless vehicles
Autonomous vehicles may have more difficulty detecting children and people with darker skin, according to a new study.
The research, carried out by a team of academics from King’s College London and Peking University in China, reached that conclusion after assessing eight artificial intelligence-based pedestrian detectors used in the development of driverless cars.
Alarmingly, detection accuracy for adults was found to be 19.62% higher than for children, while there was a 7.52% disparity between light-skinned and dark-skinned individuals.
In contrast, there was only a 1.1% difference in detection accuracy based on gender.
The study, which has yet to be peer-reviewed, required data sets that include demographic labels for pedestrians. However, as the paper, “Unmasking Fairness Issues of Autonomous Driving Systems,” notes, the data sets commonly used for pedestrian detection often lack labels of this nature.
As a result, the team’s only option was to add labels manually to four data sets, producing a collection of 8,311 real-world images annotated with 16,070 gender labels, 20,115 age labels and 3,513 skin-tone labels.
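For illustration, a minimal sketch of what one such annotation record might look like appears below. The class structure, field names and label values are assumptions made for this example; the article does not specify the study’s actual schema.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical annotation record for one pedestrian in one image,
# assuming demographic labels like those the study describes.
@dataclass
class PedestrianAnnotation:
    image_id: str
    bbox: Tuple[float, float, float, float]  # (x, y, width, height) in pixels
    gender: Optional[str]     # e.g. "male" / "female"; None if not judgeable
    age: Optional[str]        # e.g. "adult" / "child"
    skin_tone: Optional[str]  # e.g. "light" / "dark"

# Not every attribute can be judged from every image, which would explain
# why the label counts differ (16,070 gender vs. 3,513 skin-tone labels).
ann = PedestrianAnnotation(
    image_id="img_0001.jpg",
    bbox=(120.0, 85.0, 40.0, 110.0),
    gender="female",
    age="child",
    skin_tone=None,  # attribute not visible in this image
)
```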
With those labels in place, the team was able to conduct fairness testing on the pedestrian detectors, and that is when the alarming disparities in detection came to light.
Of particular concern was the discovery that “detection performance for the dark-skin group decreases under low-brightness and low-contrast conditions compared to the light-skin group.” The team reported that the proportion of dark-skinned pedestrians the detectors missed rose from 7.14% in daytime scenarios to 9.86% at night.
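Fairness testing of this kind essentially compares miss rates across demographic groups and scene conditions. The sketch below is a simplified illustration rather than the paper’s actual evaluation code; the field names and label values are assumptions for this example.

```python
from collections import defaultdict

def miss_rates(samples):
    """Compute per-(group, condition) miss rates.

    Each sample is a dict with (illustrative, assumed field names):
      'skin_tone': demographic label, e.g. "light" or "dark"
      'condition': scene condition, e.g. "day" or "night"
      'detected':  True if the detector found this pedestrian
    """
    totals = defaultdict(int)
    misses = defaultdict(int)
    for s in samples:
        key = (s["skin_tone"], s["condition"])
        totals[key] += 1
        if not s["detected"]:
            misses[key] += 1
    return {key: misses[key] / totals[key] for key in totals}

# Toy data: a real evaluation would cover thousands of annotated
# pedestrians per group, like the study's 7.14% (day) vs. 9.86% (night)
# miss rates for the dark-skin group.
samples = [
    {"skin_tone": "dark", "condition": "day", "detected": True},
    {"skin_tone": "dark", "condition": "night", "detected": False},
    {"skin_tone": "light", "condition": "night", "detected": True},
]
for (group, cond), rate in sorted(miss_rates(samples).items()):
    print(f"{group:>5} / {cond:<5}: miss rate {rate:.2%}")
```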
This is especially worrying, given that car manufacturers and AV developers are likely using similar software and data sets in their development processes. Although this cannot be confirmed, since automakers are entitled to keep such details confidential, it seems almost inevitable: developers tend to rely on the latest open-source models.
And because the main open-source collections of pedestrian images used to train AI models feature more people with light skin than dark skin, there is a danger of an inherent bias being built in.
The solution, the study’s authors argue, is straightforward. As the paper puts it: “Prioritizing group fairness when building software has emerged as an essential ethical duty and requirement for software engineers.”
They also call for more government involvement in the process: “It is essential for policymakers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately.”
The research is not the first to highlight how people with darker skin are more at risk of being hit by an autonomous car. A similar report, Predictive Inequity in Object Detection, was produced by academics at the Georgia Institute of Technology in 2019.
The latest research can be downloaded as a PDF here.