By Marybeth McGinnis
A recent study found that leading automated detection systems are less accurate at detecting pedestrians with darker skin tones.
On average, the study found that detection was five percentage points less accurate for dark-skinned pedestrians than for light-skinned ones. This finding points toward the concept of predictive inequity: biases in automated models that lead to worse outcomes for people with darker skin. Nor is this the first study to raise serious questions about the inequitable and potentially dangerous impacts of autonomous vehicle (AV) and machine learning technology.
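To make the measured gap concrete, here is a minimal sketch, in Python, of how a detection-accuracy gap between skin-tone groups can be quantified. This is not the study's code; the data structure, function names, and the grouping of Fitzpatrick skin types 1–3 versus 4–6 are illustrative assumptions for a simplified per-group detection rate rather than the full average-precision metric researchers typically report.

```python
from dataclasses import dataclass


@dataclass
class AnnotatedPedestrian:
    """A ground-truth pedestrian annotation (hypothetical schema)."""
    fitzpatrick_type: int  # 1 (lightest) to 6 (darkest)
    detected: bool         # did the detector find this pedestrian?


def detection_rate(pedestrians, skin_types):
    """Fraction of pedestrians in the given skin-type group that the
    detector successfully found."""
    group = [p for p in pedestrians if p.fitzpatrick_type in skin_types]
    if not group:
        return float("nan")
    return sum(p.detected for p in group) / len(group)


def predictive_inequity_gap(pedestrians):
    """Percentage-point gap in detection rate between lighter-skinned
    (Fitzpatrick 1-3) and darker-skinned (Fitzpatrick 4-6) pedestrians."""
    light = detection_rate(pedestrians, {1, 2, 3})
    dark = detection_rate(pedestrians, {4, 5, 6})
    return 100 * (light - dark)


# Toy example: 3 of 4 lighter-skinned pedestrians detected (75%) versus
# 2 of 4 darker-skinned (50%), a 25-point gap.
sample = [
    AnnotatedPedestrian(2, True), AnnotatedPedestrian(1, True),
    AnnotatedPedestrian(3, True), AnnotatedPedestrian(2, False),
    AnnotatedPedestrian(5, True), AnnotatedPedestrian(6, False),
    AnnotatedPedestrian(4, True), AnnotatedPedestrian(5, False),
]
print(f"gap: {predictive_inequity_gap(sample):.1f} percentage points")
```

A positive gap means the detector misses darker-skinned pedestrians more often, which is exactly the kind of disparity the study reports.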
One critique of the study could be that the researchers did not evaluate the models and datasets used by companies that manufacture AVs. That, however, is all the more to the point: companies do not make their algorithms and datasets publicly available, making it difficult to ensure equitable safety and independent oversight.
These questions have never been more important. As AV technology continues to push forward, the country is in a moment of questioning what a “safe street” is, and for whom. It is critical that transportation policymakers and engineers consider the impacts on black and brown people when adopting and implementing new technologies. AV policy must ensure the safety of vulnerable road users, and it must ensure that safety equally, regardless of a pedestrian’s skin tone.
Photo Credit: Lukas Hartmann via Pexels, unmodified.