Earlier this month, we released BAD INPUT, a three-part series of short films from CR exploring how bias in technology and data can result in real-life consequences for communities of color. BAD INPUT is a collaboration with Kapor Foundation, a national nonprofit dedicated to racial justice, and filmmaker Alice Gu.
Whether it’s applying for a loan, getting an auto insurance quote, or accessing medical care, we depend on technology to be neutral. But when systems are built on flawed data, or don’t account for a diversity of lived experiences, they produce outcomes that are anything but neutral. BAD INPUT tells the story of the risks hidden in seemingly “neutral” tech, and the people working to make systems more fair:
- Chapter 1 – Medical Devices explores how poorly designed sensors made the pandemic more deadly for people of color, featuring Thomas Valley from the University of Michigan and Crystal Grant of the American Civil Liberties Union.
- Chapter 2 – Mortgage Lending explores how discriminatory practices in mortgage lending are alive and well in the digital era, featuring Michael Akinwumi of the National Fair Housing Alliance, Melissa Koide from FinRegLab, and Kareem Saleh of FairPlay AI.
- Chapter 3 – Facial Recognition explores the implications of widespread digital surveillance enabled by our connected consumer devices, and what happens when innocent people are caught up in digital dragnets, featuring Timnit Gebru of the Distributed AI Research Institute, privacy researcher Chris Gilliard, and Vinhcent Le of The Greenlining Institute.
How You Can Take Action For Algorithmic Justice
CR is showcasing the work of community partners on our landing page. We invite you to explore what these partners are doing and how you can get involved, from algorithmic audits and accountability to stronger consumer protections. We also encourage you to screen and share the films in your own communities; we've prepared a conversation guide to get you started.
You can also sign our petition here to join CR and our community partners, who are working toward a world where no one is harmed by an inherently biased computer program.