“Coded Bias” lifts the mask off systemic bias in facial recognition software
Shalini Kantayya's documentary is a chilling primer on the ways technology reflects the flaws of the systems that create and use it.

We often tout the objective eye of science and technology as the cure for society's ills: prejudice, bias, systemic discrimination. But what Shalini Kantayya's documentary Coded Bias presupposes, and makes a damn convincing argument for, is the idea that AI is subject to the same biases as the people who program it. And in so doing, she highlights the deep-seated problems in using AI technologies on the general public, from facial recognition software to resume-sorting programs. Turns out, they don't make us more color- or gender-blind; they just dispense those judgments with even crueler efficiency.

It all starts with MIT Ph.D. student Joy Buolamwini, whose facial recognition AI experiments led to a surprising result: she, a dark-skinned Black woman, wasn't being accurately read by the very software she was using. This prods her to probe deeper, and Kantayya uses her investigations into AI bias as an anchor to explore how companies and governments are misapplying these technologies around the world. The results are straightforward, clear-minded, and immensely disturbing to behold.

We learn of organizations like the UK's Big Brother Watch, which advocate against, for instance, the use of facial-recognition cameras to scan people's faces without their consent (complete with canvassers handing out flyers to remind passersby of their rights). There's the Microsoft-built AI that turned virulently racist and sexist after sixteen hours of trolling, emulating the human behavior to which it was exposed. We also follow a number of other case studies, from an acclaimed teacher fired on the word of an algorithm that rated teacher performance to the downright Orwellian 'social value scores' in China.

Kantayya keeps the proceedings moving apace, grounding us in everyday language and examples that allow the layman to relate to the more tech-headed issues on display. (She also reserves space to explore how the media has shaped the way we think of AI.) The CG graphics and text overlays can sometimes feel less than polished (more advocacy video than feature documentary).

These are minor concerns, however, as the importance of the issue, and the clarity with which Kantayya conveys its urgency, win out over a few chintzy graphics. The most effective, albeit strident, flourish is a HAL 9000-like AI that tells us, in its eerily calm computer voice, that "sometimes, you don't even know when I'm making these decisions" about your life, from approving your home loan to accusing you of a crime.

In a year when conversations about racial and criminal justice are more important than ever, Coded Bias' broader points about the way the tech industry projects its own gender and racial biases onto its technologies ring particularly true. As the doc points out, most of the AI programmers in the industry are men, particularly white men, and the datasets and frames of reference they use to program these systems inherently favor those biases, however unconscious they may be. They then pass these programs on to tech companies, which use these vast archives of data as they like. It's a startling thing to realize, especially when presented as directly as Kantayya does here.

Turns out, [AI programs] don't make us more color- or gender-blind; they just dispense those judgments with even crueler efficiency.

Underneath the more direct calls for technological justice, Kantayya wants us to know that there are smart, powerful, driven women and people of color in the industry who may be able to right the ship. "I'm so used to being underestimated," Buolamwini says at one point to a group of fellow female programmers, some of whom recount stories of men telling them "girls don't need to learn math." Given that she's now a powerful figure in the fight for technological equity, and the founder of the Algorithmic Justice League, an organization committed to exposing biases and harms in AI, we'd do well to follow her example. In spotlighting her, Coded Bias also advocates for the real solution to these problems: getting the input and leadership of more women and POC in tech.

"There is no algorithm to identify what is just," Kantayya's film is quick to remind us. That's our job; we have to apply human values of equity and fairness to technologies that big business and law enforcement are rolling out faster than we can keep up with. As Coded Bias reminds us, the US is still the "Wild West" for these kinds of artificial intelligence companies, with the fast pace of 21st-century life encouraging companies and governmental entities to adopt these programs without thought to how they might be applied against the poor and marginalized.

Much like the sheer volume of information AI programs process to keep the engines of modern life going, the scope of Coded Bias' implications can leave you gasping, especially if you don't match the gender or shade of the kinds of people who write the rules for these programs. But Kantayya's work is thorough and concise, serving as vital advocacy for change against the system's latest attempts to control us through information.

Coded Bias is playing in the Metrograph’s virtual cinema starting November 11th.
