Facial Recognition and Racial Bias
It is estimated that almost half of American adults – over 117 million people, as of 2016 – have photos within a facial recognition network used by law enforcement. This participation occurs without consent, or even awareness, and is bolstered by a lack of legislative oversight. More disturbingly, the current implementation of these technologies involves significant racial bias, particularly against Black Americans.
harvard.edu
Three Black men – Robert Williams, Michael Oliver, and Nijeer Parks – have had their lives upended by wrongful arrests in as many years. Each was misidentified by facial recognition software, arrested, and held under suspicion of crimes ranging from petty theft to assault of a police officer. For Parks, who was accused of the more serious crimes of assault and eluding the police, the fight to clear his name went on for the better part of a year. Before his case was thrown out of court and his name cleared, Parks spent 10 days in jail, all due to hyper-reliance on technology. In the subsequently filed lawsuit against the Woodbridge Police Department, its affiliates, and Idemia, the company behind the facial recognition software, Parks alleged that proper investigative techniques were forgone in favor of faulty technology.
Despite widely published research findings detailing the misidentification of darker-skinned faces by facial recognition technologies, law enforcement's hyper-reliance persists. For BIPOC, and most notably dark-skinned Black women – for whom misidentification occurs as often as 33% of the time, compared to the near-perfect accuracy achieved for white men – this adds another layer of concern.
“Automated systems are not inherently neutral. They reflect the priorities, preferences, and prejudices—the coded gaze—of those who have the power to mold artificial intelligence.”
Gendershades.org
BIPOC are more highly surveilled by law enforcement agencies, more likely to be arrested, more likely to receive harsher sentences when convicted, and most daunting of all—are most likely to die at the hands of law enforcement officers.
Wrongful arrests, even when eventually dismissed, have far-reaching consequences for the personal and professional lives, mental health, and wellness of the victims, as well as those of their families.
Technological design should be just, equitable, and relationship-centered; it should be built with, not just for, all users and those impacted by its use. If we are not building for stress use, fringe use, and nefarious use – if we are not intentionally building for the most vulnerable among us, scaling out to close the gaps in literacy, accessibility, and equity, and actively designing for harm mitigation – we are making the willful choice to perpetuate harm.