Interview with Shail Patel on decolonisation of, and anti-racism in, the mathematical sciences

In this video, Shail Patel discusses decolonisation of, and anti-racism in, the mathematical sciences.

In January 2020, Robert Julian-Borchak Williams of Detroit was arrested in front of his wife and children, interrogated, held for 30 hours in a detention centre, and had his DNA and fingerprints taken, all solely on the basis of a facial recognition algorithm. He was later released, and in court all charges were dropped. Mr Williams is black [1].

In this modern age we are surrounded by algorithms: equations that do things. These may be advertised in smartphones, smart TVs or smart cars, or hidden when we apply for a loan [2], search on the internet [3] or go to hospital [4]. These algorithms typically revolve around data and make predictions based on that data. When the data involves human beings, human biases and prejudices are in danger of affecting the behaviour of the algorithm, with potentially disastrous effects on people's lives [5]. Through easy-to-understand recent examples, Shail Patel explains how this arises and what we can do to counter these effects [6].
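The mechanism described above, a model trained on historically biased decisions reproducing that bias in its predictions, can be sketched in a few lines. The example below is not from the video; the lending scenario, the group labels and all the numbers are synthetic and purely illustrative.

```python
# A toy illustration of how bias in training data propagates into a
# predictive algorithm. All data here is synthetic and hypothetical.

from collections import defaultdict

# Synthetic "historical" loan decisions: (group, qualified, approved).
# Past human decisions approved qualified applicants from group A at a
# higher rate than equally qualified applicants from group B.
history = (
    [("A", True, True)] * 90 + [("A", True, False)] * 10 +
    [("B", True, True)] * 60 + [("B", True, False)] * 40
)

def learn_approval_rates(records):
    """'Train' by memorising the historical approval rate per group."""
    approved = defaultdict(int)
    total = defaultdict(int)
    for group, qualified, was_approved in records:
        if qualified:
            total[group] += 1
            approved[group] += was_approved
    return {g: approved[g] / total[g] for g in total}

rates = learn_approval_rates(history)
# The "model" faithfully reproduces the disparity in its training data:
# equally qualified applicants get different predicted approval rates.
print(rates)  # {'A': 0.9, 'B': 0.6}
```

Nothing in the code mentions race or any protected attribute explicitly; the disparity enters entirely through the historical labels, which is exactly why biased training data is so hard to spot from the algorithm alone.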

References

[1] Kashmir Hill, "Wrongly Accused by an Algorithm", New York Times, 24 June 2020. https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html

[2] Jérémie Bertrand and Laurent Weill, "Do Algorithms Discriminate Against African Americans in Lending?", presentation at the Fintech and Digital Finance Conference, SKEMA, Nice, 2019.

[3] Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism, NYU Press, 2018.

[4] Heidi Ledford, "Millions of black people affected by racial bias in health-care algorithms", Nature, vol. 574, no. 7780, 2019, p. 608+.

[5] Rashida Richardson, Jason Schultz and Kate Crawford, "Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice", 94 N.Y.U. Law Review Online 192 (2019). https://ssrn.com/abstract=3333423

[6] Matt Kusner and Joshua Loftus, "The long road to fairer algorithms", Nature, vol. 578, 2020, pp. 34-36. doi:10.1038/d41586-020-00274-3

[7] Julian Goldsmith, "A Level shambles has lessons for justice", Law Society Gazette, 24 August 2020. https://www.lawgazette.co.uk/commentary-and-opinion/a-level-shambles-has-lessons-for-justice/5105403.article

Shail Patel is retired; he was previously Research Director for Digital Consumers & Markets at Unilever R&D.

This article is from the free online course Decolonising Education: From Theory to Practice, created by FutureLearn.
