60 Minutes and (En)Coded Bias

Gut Reaction

 Droplets of water building until the cup overflows
 You spend time. 
 Spend effort.
 And when you see a final product, 
 without your name- 
 without your history- 
 you wonder if you’re the 
 one who’s crazy. 

Recently, CBS’s 60 Minutes aired a segment on racial bias in facial recognition technology, referring to a December 2019 National Institute of Standards and Technology (NIST) study as a “landmark study” while failing to mention the groundbreaking research on which the NIST study was based, research conducted by AI pioneers and Black women: Joy Buolamwini, Dr. Timnit Gebru, and Inioluwa Deborah Raji.

Ms. Buolamwini, who spent hours prepping the 60 Minutes team, was given no credit for her work, nor was she acknowledged as the one who made the work groundbreaking. She was erased from the narrative while her work and knowledge were credited to what she refers to as “Pale Males.”

Appalled, I reshared a LinkedIn post by the Algorithmic Justice League, as well as one by Ms. Buolamwini herself, on my personal LinkedIn profile with the following comment, hoping to inform others of how Black women continue to be treated as disposable within the tech community:

“Pay attention: this is what misogynoir looks and feels like”

Misogynoir is the racism, hatred, and prejudice aimed particularly at Black women. So, imagine my surprise when, upon sharing Ms. Buolamwini’s post with my audience on LinkedIn, the very content describing this egregious act went missing from the site. As I searched frantically for the posts I had shared on my page, and for the original, I didn’t think to check the comments on Ms. Buolamwini’s page, because I’d accepted that the failure was mine. But then I did read the comments.

What I found was unsettling. Others mentioned that their own shared posts with comments were missing. Ms. Buolamwini’s own post on LinkedIn was missing too, although it remained present on Twitter. Still, rather than blaming LinkedIn’s algorithm for removing the content, I had the visceral gut reaction to assume this was user error. After all, a “404 Error” means the URL I am trying to visit does not exist. If I had shared it correctly, I would see it, right?

We have been trained to silence ourselves. We question our own expertise rather than a system that proves itself unconcerned with Black people at nearly every turn. This type of erasure, the kind that doesn’t happen to non-Black individuals, shows up in ways we don’t recognize. Many of us experience being ignored in the mundane, everyday actions we perform. This cannot be removed or separated from white supremacy, because erasure is the enactment of white supremacy. Such is the psychological conditioning: we are primed to assume that feeling invisible, even through small bites of erasure, is an individual failure.

This is why Joy Buolamwini has advocated against the use of AI in many capacities. One of the most well-known names in the tech industry, Ms. Buolamwini has been speaking up about the erasure of Black people in tech through her work in AI since before the publication of her 2018 paper, coauthored with Dr. Timnit Gebru. The paper, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” documents how commercial AI systems misclassify faces at sharply different rates across skin type and gender. If you’re unfamiliar, please watch and read about her experience, recounted in The Coded Gaze: Bias in Artificial Intelligence, of having to don a white mask in order to be recognized by the very technology on which she worked.

At the simplest level, an algorithm set up by an industry controlled by white people who are unconcerned with bias on a daily basis becomes dangerous, unintentionally yet intentionally. Even at the most minute level, erasing someone who has warned the industry, and the world, about its problem is harmful and feels retaliatory.

With the rampant use of technology in law enforcement, especially facial recognition technology, we see the results of this almost daily, most of all in the violence it upholds against Black lives. We need to correct how we build technology. We need to question who is included in, and excluded from, the creation process.

Congress’s House Committee on Oversight and Reform has held three hearings on AI and facial recognition technology [1][2][3]. The first was in 2018, and yet in 2021 we are still struggling to hold tech companies accountable for the safety and inclusion of people of color, even though the final recommendations suggest that “the U.S. government can build and sustain an ‘AI ready’ workforce, while also establishing appropriate oversight mechanisms to protect the civil liberties of American citizens.”

Protect the civil liberties of all people, regardless of citizenship.

Going back to even the small, mundane interactions between people and technology: building in guardrails to prevent the repetition of harms that people perpetuate, even without intent, is necessary to protect us all from bias.

Niccolò Caranti, CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0), via Wikimedia Commons

As of this publishing, the posts on LinkedIn, as well as those re-shared, are still missing. What must Joy, Deb, and Timnit feel, knowing their warnings have fallen on deaf ears? Even worse, their warnings are heard and publicized while the authors themselves are erased from the conversation. What we know is this: tech is representative of the society for which it has been created, and that society includes Black people as an afterthought. As usual, we do the labor but are removed and kept from the discussion. Building with inclusion in mind, with teams that reflect the world for which the tech is built, is necessary to prevent harm.

Joy Buolamwini: How I’m Fighting Bias in Algorithms (TED Talk)

Dr. Dede Tetsubayashi’s (Deh-deh Teh-tsu-bye-ya-she) expertise lies at the intersection of DEI, product, and business value, and in integrating them into a team’s and organization’s best practices. She has extensive experience building frameworks and guidelines that integrate product inclusion into the development process, and driving their adoption as an integral part of phased, prioritized roadmaps for teams to execute against. Dede is a member of the Equity Army, a group run by Annie Jean-Baptiste focused on educating organizations on product inclusion. She is also a founding member of Tech Ladies, a group focused on inclusivity in tech, and is working on two new publications: a memoir and a product inclusion guide.
