‘Coded Bias’: This Netflix Documentary Asks Whether AIs Can Be Biased, and the Answer Is Terrifying in Its Scope

It all begins with a disturbing observation that could be dismissed as mere anecdote: an African-American researcher, Joy Buolamwini, realizes that a facial recognition program does not detect or identify her face as that of a person it can register in its database. But it does when she puts on a neutral… and white mask.

That is the starting point of a documentary that touches on many topics, all stemming from the arbitrariness and lack of ethics with which algorithms collect the information that builds their databases and feeds the knowledge of different AIs. An arbitrariness shaped by the prejudices we all hold, which means, as one of the participants in this interesting documentary puts it, that racism ends up mechanized and replicated.


At another point, the documentary recounts the experience of a young Chinese woman with her country's system of constant surveillance and identification, and with its social credit system. An American expert then argues that this system is not so different from Western surveillance of citizens through social networks… except that in China, at least, the government acknowledges it.

In this way, the documentary travels around the world, raising questions about privacy, technology and how new computational models impose restrictions on us that, in many respects, we thought we had overcome. ‘Coded Bias’ does not deliver an unambiguous, indisputable message, but it does confront the viewer with a host of questions we urgently need to ask ourselves (and, above all, answer).

The racism of the algorithm

‘Coded Bias’ belongs to a series of recent technology documentaries warning of the dangers of very similar issues: the loss of privacy, how technology creeps unnoticed into spheres that previously belonged to humans alone and, in general, the risk of losing control of what we ourselves have created. Both ‘The Great Hack’ and ‘The Social Dilemma’, which can also be seen on Netflix, focused on the unchecked erosion of the private sphere.


‘Coded Bias’ touches on this topic too, but as part of a larger problem: the abstract threat of a decentralized mathematical algorithm that learns, but learns poorly. And yes, it delves into electoral manipulation (not always premeditated, and that is one of the most disturbing points of the whole film: there are no supervillains here, just the failures of society as a whole), but also into the erosion of individual rights, such as British or US law enforcement agencies relying on databases built from prejudiced data.


It is in these moments that the documentary flirts with a sensationalism that threatens to trivialize its discourse. Nods to films like the adaptation of ‘1984’ or the evil AI of ‘2001: A Space Odyssey’, with a robotic voice speaking in the first person, are amiable but clash with the solemn seriousness that the subject demands. Fortunately, ‘Coded Bias’ and its director Shalini Kantayya (one of the experts who speak in the film) never forget that we are talking about technology, but above all about its impact on people.

“AI is based on data, and the data is a reflection of our history,” says Buolamwini in the documentary, and that is the essence of its somewhat bitter conclusion: AIs are prejudiced because people are full of prejudices. On the other side of the coin, though, the documentary is optimistic, closing with a small triumph of the researcher over the mathematical machinery, with a charming techno-poem and with a call to action. It is in our power to curb many of the excesses that ‘Coded Bias’ denounces, and from that point of view, despite its warnings, it manages to go further than other documentaries of its kind.
