s08e12: This Is Your Regular Reminder
0.0 Context setting
Just read this one, please.
1.0 One Thing That Should Catch Your Attention
1.1 This Is Your Regular Reminder
Adapted and cleaned up from this bunch of tweets that you may not have seen because, unlike me, you have a healthier relationship with Twitter.
This is your regular reminder that when people were saying machine learning, AI and deep learning would solve the problem of content moderation and hate speech, they ended up making racist AI that was one and a half times more likely to label African American tweets as offensive. [Vox/Recode, reporting on The Risk of Racial Bias in Hate Speech Detection and Racial Bias in Hate Speech and Abusive Language Detection Datasets, 2019]
This is your regular reminder that when algorithms — rules, really — are encoded in software to make, or help make, decisions on how to treat people, the resulting systems, used in hospitals, are racist and “systemically discriminate against black people”. In practice, black people are less likely than equally sick white people to be referred to programmes that aim to improve care for patients with complex medical needs. [Nature, Millions of black people affected by racial bias in health-care algorithms, 2019]
This is your regular reminder that face recognition software has a false positive rate 10 to 100 times worse for African Americans, with the worst false match rate of all for African American women. A false positive, or false match, is when the software finds a match where there isn’t actually one. [MIT Technology Review reporting on Part 3 of the NIST Face Recognition Vendor Test, 2019]
This is your regular reminder that in 2009 — over ten years ago — Hewlett Packard released a webcam that didn’t recognize black people because “the technology we use is built on standard algorithms that measure the difference in intensity of contrast between the eyes and the upper cheek and nose. We believe that the camera might have difficulty 'seeing' contrast in conditions where there is insufficient foreground lighting”. [The Guardian, Are Hewlett-Packard computers really racist?, 2009]
This is your regular reminder that speech recognition used by Amazon, Apple, Google, IBM and Microsoft on average misidentified words around 35% of the time for African Americans, compared to around 19% of the time for white people. Imagine one in every three words you speak being misunderstood. [Racial disparities in automated speech recognition, 2020]
This is your regular reminder that despite all of the above, people and organizations keep wanting AI and machine learning to be used in schools. [AI can disrupt racial inequity in schools, or make it much worse, 2019]
This is your regular reminder that despite all of the above, and despite facial recognition software misidentifying African American people more often than white people, police departments have access to software that can “identify” people in crowds. [Buzzfeed News, Many Police Departments Have Software That Can Identify People In Crowds, 2020]
This is your regular reminder to read Dr. Safiya Noble’s Algorithms of Oppression. You can read and watch this interview with her from USC Annenberg.
This is your regular reminder to read and share with your colleagues and friends articles like Of course technology perpetuates racism. It was designed that way. by Charlton McIlwain, NYU Steinhardt’s Vice Provost for Faculty Engagement and Development and Professor of Media, Culture, and Communication.
This is your regular reminder to read, share and act upon resources like the Anti-Racist Resource Guide for the Tech & VC Community from the Female Founders Fund.
This is your regular reminder to read books like Virginia Eubanks’ Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor.
This is your regular reminder to follow organizations like POCIT, People of Color in Tech, and to encourage your hiring managers and HR department to use their job board.
This is your regular reminder to donate to organizations like the NAACP Legal Defense Fund, the National Civil Rights Museum and the Marsha P. Johnson Institute.
This is your regular reminder that technology — and right now most especially software and hardware and their interrelated systems — is not neutral.
This is your regular reminder that technology reflects the people who make it.
This is your regular reminder that technology reflects the desires, actions and aims of the people who use it.
This is your regular reminder that technology reflects history. That it reflects structure. That it reflects culture.
This is your regular reminder that technology reflects power.
Black lives matter.
That’s it. That’s the newsletter.
I’m not fine.
How are you?
Dan.