
Interview with Shail Patel on decolonisation of, and anti-racism in, the mathematical sciences

This video is an interview with Shail Patel who talks about decolonisation of, and anti-racism in, the mathematical sciences.
I want to say a big thank you out to the Black Lives Matter people. I think, really, we wouldn't be here having this conversation now if it wasn't for all the great work they've done.

For me, I think the whole issue of decolonisation has more urgency in the social sciences, subjects like history, art, literature, whatever, where you can really see strong colonial influences. In the mathematical sciences, and in STEM subjects in general, my feeling is that anti-racism is more of a burning, important issue. So I'm going to substitute anti-racism into that question and say, okay, well, there are two levels at which anti-racism features.

One is at the level of structural racism. You look at staffing and, you know, the number of people of colour, particularly at senior staff levels, within universities. You look at university entrance and the difficulty it is for people of colour to get into university. You look at racism now being reported in lecture halls, actually, you know, in student halls and things like that. And these are serious structural issues which do not affect the mathematical sciences alone, but affect them as well as everybody else. So I'm not going to go into that in more detail, but I wanted to flag up that level; I think it's probably the most important level to engage with anti-racism.

So the second level is the level of curriculum. And again, I'm going to take mathematical sciences as a sort of broad area. It's not just pure mathematics or even applied mathematics, but computer science, statistics, data modelling, things like this. Artificial intelligence, which is the big one.
So, you know, when I first started thinking about this, I thought: how can an equation be biased? You know, an equation is an equation, how can it have racism in it? And if you look deeper, these algorithms are not just biased, and even racist, but have serious consequences on people's lives.

So I want to give you one example. In January 2020, Robert Julian... when I read the name... Robert Julian-Borchak Williams was handcuffed on his driveway, in front of his wife and children, and taken off to prison on suspicion of stealing five watches from a store in Detroit. Now, on this suspicion he was taken to a detention centre, kept there overnight, kept for 30 hours. And this was all done on the basis of a face recognition algorithm alone. And it's the first time that we know of, at least that's been reported, that somebody has been arrested in plain daylight and put into detention for 30 hours on the basis of the prediction of an algorithm alone.
Now, he had his DNA taken, he was fingerprinted, he was questioned. Two weeks later he had to go and appear in court, and of course the case was dropped, because there wasn't any evidence. And you've probably already guessed, of course, that Mr Williams was black. Right? And sadly, that's not surprising.

So how did this happen? How did this come about? And there's lots in the press about racism and algorithms, and if you just do a Google search on racist algorithms or racist artificial intelligence, you'll find oodles of stuff. So just this one example, how did it come about? Well, modern algorithms used in anger, in industry and by the police and, where else... everywhere else: many of them are based on artificial intelligence, neural nets, you may have heard of these. These things are trained on data, and the algorithm is only as good as the data.

And it turns out... so, what happened in the store was that they had a grainy CCTV image of the person who did the robbery, and that came up with a 'positive match', in inverted commas, with Mr Williams. Now, there are more white faces that these algorithms are trained on than black faces, so they do a better job of recognising white faces than they do with black faces. This is well reported in the press and in academic journals; up to a hundred times, even, it's been reported in one article in the MIT Review. And not just black people, but people of colour.
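To make that "only as good as the data" point concrete, here is a minimal sketch, entirely hypothetical (toy numbers and synthetic "face features", not any real vendor's system): a classifier trained on a 90/10 imbalanced mix of two groups ends up markedly less accurate on the under-represented group.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_faces(n, group_shift):
    # Toy "face embeddings": match vs non-match, with a group-specific
    # offset so one linear boundary cannot suit both groups equally.
    X = rng.normal(group_shift, 1.0, size=(n, 5))
    y = rng.integers(0, 2, size=n)
    X[y == 1] += 0.8  # true matches sit slightly apart from non-matches
    return X, y

# Training set: 90% from group A, only 10% from group B.
Xa, ya = make_faces(9000, 0.0)
Xb, yb = make_faces(1000, 2.0)
model = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Balanced test sets: accuracy is systematically worse for group B.
for name, (Xt, yt) in [("group A", make_faces(2000, 0.0)),
                       ("group B", make_faces(2000, 2.0))]:
    print(name, round(model.score(Xt, yt), 3))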
So this stuff... and what is more worrying is that similar algorithms are being used for predictive policing. There's a major study in the NYU Law Review called, let's see, 'Dirty Data, Bad Predictions', saying that similar algorithms that predict the likelihood of somebody to go off and offend again are based on police records. And in many cases they have found evidence that these records themselves have been systematically tampered with: they're corrupt, they're unlawful, they're dirty, if you like. And so what the predictive algorithm will do, of course, is faithfully mimic, imitate, copy the behaviour of corrupt police officers. And, for example, you imagine a judge deciding whether to give bail or not. They'll use a predictive algorithm like that, and it will be racist in its effect, and this will affect people's lives. They found examples of this, I think, in Chicago, New Orleans and Maricopa County in Arizona.
So, and as I said, there's many, many more examples. And, well, what can be done? That's the key question; that's where you started. Yeah, what can be done about this? A lot more research needs to be done on the mathematical science end of things: what can we do in order to make sure that algorithms aren't biased? But ultimately it's not just about... I'm sorry, I meant to say it's not just about the techies who make the algorithms, it's about people. It's about the people who gather the data, who vet the data for, you know, how good it is. It's about the people who then use the algorithms when they're deployed, in situ. And so what needs to be done is partly at the mathematical sciences end, but partly you're talking about a global kind of effort in standards, maybe ethical reviews, and, coupled with that, a massive mindset shift in terms of how we approach this area.
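On the "make sure that algorithms aren't biased" end, one concrete starting point is simply auditing a deployed model's outputs by group. A minimal sketch, assuming you already have hard predictions, true labels and a group attribute as arrays; the function names here are illustrative, not a standard API.

import numpy as np

def demographic_parity_gap(pred, group):
    # Gap in positive-prediction rates between groups.
    rates = [pred[group == g].mean() for g in np.unique(group)]
    return max(rates) - min(rates)

def equal_opportunity_gap(pred, label, group):
    # Gap in true-positive rates (recall) between groups.
    tprs = [pred[(group == g) & (label == 1)].mean()
            for g in np.unique(group)]
    return max(tprs) - min(tprs)

# Usage, for any classifier's outputs:
#   demographic_parity_gap(y_pred, group)
#   equal_opportunity_gap(y_pred, y_true, group)
# A gap near zero is necessary, not sufficient: these checks cannot see
# bias baked into the labels themselves, as in the policing example above.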
Again, I made a list: loan and mortgage applications, credit risk rating, fraud detection, granting of bail. We touched on exam grade prediction; you may remember last summer it was that big furore, and that was just a simple linear mathematical equation, no clever neural nets or anything. Insurance premiums, job application screening, housing benefits, medical diagnosis algorithms, border security. A very interesting example: a campaigning group, Foxglove, working with the Joint Council for the Welfare of Immigrants, took the Home Office to judicial review on an algorithm that they were using that they claimed was racist, and they won a judicial review against the Home Office. That's a major thing. Border security, intruder alarms, automated news feeds and summaries. I mean, it's a long, long, long list.

There's two things I want to say: to give an example, but I also want to give a ray of hope. That actually, in some strange way, if the racism is externalised in an algorithm, it might be easier for humans to deal with than if it's internal within a human being. You can say, 'It's not me, guv, look, it's this algorithm that's racist. Okay, let's fix the algorithm then.' Right?
The second thing I wanted to say is that the racism doesn't just happen at a conscious level; it can happen subconsciously. So there's an example with Twitter: it deployed an algorithm that, in an image with a bunch of faces in it, would automatically zoom in on the most important face and use that, for example if you wanted an image for your own home screen; this is the image you wanted. And what one Twitter user, actually a complete random person, found was that it was systematically biased towards white people: it would pick out white people. And they did an excellent example of putting together a joint image of Obama and Mitch McConnell, and Mitch McConnell, who's the leader of the... the Republican leader in the Senate. And the algorithm invariably picked Mitch McConnell, who's white, over President Obama. And Twitter said that they had vetted this algorithm for racial bias and they hadn't found this effect. And they've been very open about it: they've taken it away, they're going to look at it carefully, they're going to put all this out on the internet, all the results. It's just a couple of months ago this happened.
But the second thing that's interesting about this is the reason it came about: it's because they based their algorithm on the findings of the psychological process called eyeball tracking, where you present two images to somebody in your laboratory and you see where their eyeballs focus; you know, with these cameras you can check that on the screen. And subconsciously, and this is the issue, these are subconscious processes, people will tend to gravitate towards white faces.
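For a sense of how that preference propagates, here is a toy sketch of saliency cropping (not Twitter's actual model, whose internals I'm not assuming): score every candidate window by summed saliency and keep the best one. Whatever the learned saliency map over-weights, the crop will centre.

import numpy as np

def best_crop(saliency, h, w):
    # Return the top-left corner of the h-by-w window with the highest
    # total saliency; this window becomes the displayed crop.
    H, W = saliency.shape
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            score = saliency[i:i + h, j:j + w].sum()
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos

# The saliency map itself comes from a model trained on eye-tracking
# data; if that data over-weights white faces, so will every crop.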
And then we have to ask why that is happening, and you're talking about unpicking a whole legacy of decades of television, film, advertising, images. A random aside, for example: did you know that a quarter of cowboys were black?

I don't want to get into hair-splitting. I mean, I think what people are doing in decolonisation is great, and I support all their efforts, and what people are doing in anti-racism is great, and I support all the efforts they're doing. And, you know, I think we can sometimes fall down rabbit holes by worrying about precisely what one term means or another term means. And there are issues involved, but really, what's at the heart of the matter is: what are you going to do about it?

I'll start with the second part,
because there's lots of people working in the area, and that's great to see. I mean, again, I've got a list here of groups involved, for example: Data for Black Lives, the AI Now Institute, Black in Computing, Black in AI, the RoboHub people, the Radical AI group. And these all come from the technical side. And there's also a lot of newspaper reporters, and if you just go online and again do a search you'll see a lot of newspaper coverage. Karen Hao at the MIT Review is doing great stuff there, and the New York Times brought out this article about Mr Williams. So reporters are engaged. There's also campaign groups engaged: I mentioned Foxglove and the Joint Council for the Welfare of Immigrants, but there's many, many others involved in this. So that's good.

Probably at a technical level, I think within the mathematical sciences community there's more that needs to be done, a lot more. And I know that your colleague Eddie Wilson in Engineering Maths is talking about doing a workshop to look at how to, you know, develop mathematical science methods to rectify... well, first to spot, and then to rectify, these kinds of bias in algorithms. So, yeah, as for myself: I mean, I'm retired, I've been retired for 10 years, so I'm sort of... I talk to people and I'm trying to influence people. It's what our HR department at Unilever used to call 'influencing through others'. And so I'm not actively engaged in research myself.

Things have morphed; I think it's strange.
849.4
things have morphed i think it’s strange i<00:14:09.519> mean i mean
850.8
i mean there’s<00:14:11.199> more<00:14:11.600> racist<00:14:12.399> incidences<00:14:13.519> than there’s more racist incidences than
853.8
there’s more racist incidences than there<00:14:13.839> were<00:14:14.079> back<00:14:14.399> then there were back then
855.5
there were back then but<00:14:15.680> there<00:14:15.839> were<00:14:16.160> less<00:14:16.800> challenges<00:14:17.839> to<00:14:18.240> the but there were less challenges to the
858.7
but there were less challenges to the structures<00:14:19.360> of<00:14:19.519> racism<00:14:20.160> back<00:14:20.480> then<00:14:20.639> than structures of racism back then than
860.8
structures of racism back then than there<00:14:21.040> are<00:14:21.279> now there are now
862.6
there are now so<00:14:22.880> if<00:14:23.040> you<00:14:23.120> like<00:14:23.360> it's<00:14:23.519> all<00:14:23.680> being<00:14:24.079> talked so if you like it’s all being talked
864.4
so if you like it’s all being talked about<00:14:24.800> more<00:14:25.360> and about more and
866.1
about more and people<00:14:26.320> are<00:14:26.480> also<00:14:27.920> to<00:14:28.079> a<00:14:28.079> certain<00:14:28.480> extent people are also to a certain extent
869
people are also to a certain extent moving<00:14:29.360> towards<00:14:29.839> action<00:14:30.320> more moving towards action more
871.9
moving towards action more um<00:14:32.320> but<00:14:32.560> also<00:14:32.800> there's<00:14:33.360> a<00:14:33.519> greater um but also there’s a greater
875
um but also there’s a greater instance<00:14:35.360> i<00:14:35.519> went<00:14:35.680> to<00:14:35.839> bristol<00:14:36.240> university<00:14:36.720> i instance i went to bristol university i
877
instance i went to bristol university i never<00:14:37.279> encountered never encountered
878.2
never encountered any<00:14:38.720> overt<00:14:40.320> act<00:14:40.639> of<00:14:40.800> racism any overt act of racism
882.1
any overt act of racism but<00:14:42.320> there<00:14:43.360> there<00:14:43.600> are<00:14:43.920> now<00:14:44.639> there<00:14:44.959> are but there there are now there are
885.4
but there there are now there are over<00:14:46.000> acts<00:14:46.320> of<00:14:46.480> racism<00:14:47.680> on<00:14:47.920> british over acts of racism on british
888.4
over acts of racism on british university<00:14:48.880> campuses university campuses
890.2
university campuses so<00:14:51.040> some<00:14:51.279> things<00:14:51.519> have<00:14:51.680> changed<00:14:52.160> some<00:14:52.320> things so some things have changed some things
892.6
so some things have changed some things have<00:14:52.639> got<00:14:52.800> better<00:14:53.120> some<00:14:53.279> things<00:14:53.440> have<00:14:53.519> got have got better some things have got
893.8
have got better some things have got worse<00:14:54.079> i<00:14:54.160> think<00:14:54.320> mostly<00:14:54.720> it's<00:14:55.040> evolved worse i think mostly it’s evolved
896.9
worse i think mostly it’s evolved and<00:14:57.040> it<00:14:57.199> sort<00:14:57.440> of<00:14:57.519> just<00:14:57.760> takes<00:14:58.800> it's<00:14:58.959> a and it sort of just takes it’s a
899.3
and it sort of just takes it’s a slightly<00:14:59.760> different<00:15:00.079> form<00:15:00.480> and<00:15:00.720> shape<00:15:01.120> but slightly different form and shape but
901.3
slightly different form and shape but we’ve<00:15:01.519> not<00:15:01.760> really<00:15:02.079> moved<00:15:02.399> forward<00:15:02.880> as<00:15:02.959> sort we’ve not really moved forward as sort
903.1
we’ve not really moved forward as sort of<00:15:03.199> just<00:15:03.360> going<00:15:03.680> around of just going around
904.6
of just going around skirting<00:15:05.199> around<00:15:05.600> the<00:15:05.760> edge<00:15:07.279> um skirting around the edge um
908.9
skirting around the edge um i<00:15:09.040> think<00:15:09.600> i'll<00:15:09.760> go<00:15:09.920> back<00:15:10.160> to<00:15:10.240> what<00:15:10.399> i<00:15:10.480> said<00:15:10.639> at i think i’ll go back to what i said at
910.8
i think i’ll go back to what i said at the<00:15:10.880> beginning<00:15:11.360> i<00:15:11.519> think<00:15:11.760> black<00:15:12.160> lives<00:15:12.560> matter the beginning i think black lives matter
914.3
the beginning i think black lives matter have<00:15:14.880> really<00:15:15.360> changed<00:15:15.760> the<00:15:15.920> landscape<00:15:16.639> in have really changed the landscape in
916.7
have really changed the landscape in terms<00:15:17.120> of terms of
917.8
terms of making<00:15:18.399> people<00:15:19.120> think<00:15:19.440> about<00:15:20.480> what making people think about what
921
making people think about what positive<00:15:21.600> action<00:15:22.160> they<00:15:22.399> can<00:15:22.560> do<00:15:23.360> it's<00:15:23.600> not<00:15:23.760> a positive action they can do it’s not a
923.9
positive action they can do it’s not a question<00:15:24.320> more<00:15:24.560> analysis<00:15:25.279> more<00:15:25.519> reports<00:15:26.240> more question more analysis more reports more
926.6
question more analysis more reports more thinking<00:15:27.519> it's<00:15:27.839> what<00:15:28.160> action<00:15:28.639> can<00:15:28.800> we<00:15:38.839> take you

In January 2020 Robert Julian-Borchak Williams of Detroit was arrested in front of his wife and children, interrogated, and held for 30 hours in a detention centre, his DNA and fingerprints taken, all solely on the basis of a face-recognition algorithm. He was later released, and in court all charges were dropped. Mr Williams is black [1].

In this modern age we are surrounded by algorithms: equations that do things. They may be advertised, in smartphones, smart TVs or smart cars, or hidden, when we apply for a loan [2], search on the internet [3] or go to hospital [4]. These algorithms typically revolve around data, and make predictions based on that data. When the data involves human beings, human biases and prejudices are in danger of shaping the behaviour of the algorithm, with potentially disastrous effects on people's lives [5]. Through easy-to-understand recent examples, Shail Patel explains how this arises and what we can do to counter these effects [6].
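To make the face-recognition example concrete: the harm in the Williams case came from a false match, and audits of such systems typically begin by asking whether error rates differ by group. The sketch below uses invented toy arrays rather than real data, but the per-group audit it performs is the standard first check: the false positive rate, computed separately for each group.

```python
import numpy as np

# Hypothetical audit of a face-matching system, invented for illustration:
# y_true - whether each candidate really was the person sought
# y_pred - whether the system declared a match
# group  - demographic group recorded for each candidate
y_true = np.array([0, 0, 1, 0, 1, 0, 0, 1, 0, 0])
y_pred = np.array([1, 0, 1, 0, 1, 1, 0, 1, 1, 0])
group  = np.array(["a", "a", "a", "a", "a", "b", "b", "b", "b", "b"])

# False positive rate per group: of the people who were NOT the person
# sought, how often did the system wrongly flag them as a match?
for g in np.unique(group):
    innocent = (group == g) & (y_true == 0)
    fpr = y_pred[innocent].mean()
    print(f"group {g}: false positive rate = {fpr:.2f} "
          f"({int(y_pred[innocent].sum())} wrong matches out of {int(innocent.sum())})")
```

A large gap between the printed rates is exactly the kind of group-level disparity documented at scale in the references below.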

References

(1) Kashmir Hill, "Wrongly Accused by an Algorithm", New York Times, 24.6.20. https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html

(2) Jérémie Bertrand and Laurent Weill, "Do Algorithms Discriminate Against African Americans in Lending?", presentation at the Fintech and Digital Finance conference, SKEMA, Nice, 2019.

(3) Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism, NYU Press, 2018.

(4) Heidi Ledford, "Millions of black people affected by racial bias in health-care algorithms", Nature, vol. 574, no. 7780, 2019, p. 608+.

(5) Rashida Richardson, Jason Schultz and Kate Crawford, "Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice", 94 N.Y.U. Law Review Online 192 (2019). https://ssrn.com/abstract=3333423

(6) Matt Kusner and Joshua Loftus, "The long road to fairer algorithms", Nature, vol. 578, 2020, pp. 34-36. doi:10.1038/d41586-020-00274-3

(7) Julian Goldsmith, "A Level shambles has lessons for justice", Law Society Gazette, 24.8.20. https://www.lawgazette.co.uk/commentary-and-opinion/a-level-shambles-has-lessons-for-justice/5105403.article

Shail Patel is retired; he was previously Research Director for Digital Consumers & Markets at Unilever R&D.

This article is from the free online course Decolonising Education: From Theory to Practice, created by FutureLearn.
