Coded Bias (Netflix Documentary, 2020)
Director: Shalini Kantayya
If passing the Turing test marks the acceptance of an automaton as a legitimate thinking body, we must also have a test to ascertain whether we have enough intelligence to be identified as full-bodied Homo sapiens at all. We think we are wise, yet we repeatedly fall prey to sweet talk and a minute's indulgence, only to brood over it the morning after. We willingly give away our most personal and intimate information, only to realise much later that it has been used against us by the powerful. In the name of country and good deeds, we surrender, only to be led to the slaughter.
Even when it comes to sending someone to the guillotine, there is discrimination. This, MIT computer scientist Joy Buolamwini found out the hard way. While working on a facial-recognition device, she increasingly realised that machines repeatedly faltered at identifying black and brown faces. When she wore a white mask over her face, she did not encounter such problems. Armed with this startling discovery, she joined data scientists, mathematicians, human rights lawyers and watchdog groups on a crusade to expose discrimination by algorithms.
A recent fiasco involving the UK A-level examinations is testimony to this. With students cooped up at home and their studies frequently disrupted, the Education Department decided to use an algorithm to churn out final results based on preset parameters. The algorithm-generated grades showed private school students performing significantly better and proved biased against students from poorer backgrounds and minority groups, opening the floodgates of discontent among state school students and their parents from the not-so-affluent side of town.
[Image: Replica of the Maschinenmensch (Machine-Human) as Maria in Metropolis (1927)]
The big conglomerates that can afford to pay for these enormous troves of data streamline their business strategies to meet their self-serving ambitions. Algorithms use this data to stereotype women, non-whites and the marginalised, giving them a bad deal in résumé screening, job applications, loan eligibility and criminal suspect identification. Men and fair-skinned individuals always fare favourably under algorithmic selection.
The problem with the whole thing is that the person on the short end of the stick has no means to appeal his rejection. His plea for reconsideration is met only with chatbots or individuals who are powerless to change anything. The algorithm's choices are worshipped as if they were God-sent decrees cast in stone. Are these unquestionable orders from 'She-Who-Must-Be-Obeyed', the fictional Hilda Rumpole of John Mortimer's 'Rumpole of the Bailey'? We have not learned from 'Frankenstein' and the Maschinenmensch of 'Metropolis' (1927) that man-made creations invariably go berserk or turn against their creators.