Interview with Dr Hemma Philamore on the decolonisation of robotics and artificial intelligence

In this interview, Dr Hemma Philamore discusses the decolonisation of robotics and artificial intelligence.
I think it means improving the accessibility of robots and AI, and particularly thinking about these as things that are very integrated with human users. We're starting to see robots out in the real world, interfacing with people who don't build and operate robots. So in decolonising AI and robotics, I think it's about trying to remove the inherent bias that comes from colonialism in the human creators of robots and AI, and trying to remove those biases from systems that we're hoping will go out and be used by people who are not the developers themselves, but who, from now onwards, will be using AI and robotics products.

These are systems that are designed to work autonomously, so they're designed to work without any human supervisory input, or with minimal human supervisory input; that's the ideal you're working towards with something that is deemed, or developed, to have intelligent behaviour. So the system that you then put out into the world needs to be able to interact with anyone and everyone it encounters, without a human in the loop to correct its behaviour. Designing systems that are biased in their effectiveness towards one group or another basically means increasing the probability of your system failing once it's out in the world. So in trying to remove sources of inherent bias, or the effects of inherent bias, from developing robots and developing AI, we are increasing the probability of it being a successful technology once it's out of the hands of the people who develop it, the people who create it, and it's put through that rigorous test of having to work in a stochastic, unpredictable, real-world setting.

I think the biggest one is probably the fact that we often forget the human agency in developing things that are supposed to be standalone agents. There's often a perception of artificial intelligence that it's a black box, some kind of magic that does its own thing, but there's the human who developed that algorithm, and all of their, sometimes unintentional, bias goes into the creation of that algorithm. So I think the biggest problem, or one of the greatest challenges, with trying to implement this removal of biases is that it's so ingrained, sometimes unconsciously, in the people who are building and developing new systems. You can try to build something as inclusive to everyone around you as you like, but that's very much influenced by your own perspective of who "everyone" is. People outside of your immediate circle, the people you operate with and interact with, might not even occur to you when developing something, and so your own limited sphere of interaction can, if you're developing one of these things, feed forward into the thing you develop and impart your bias into it.

There's the teaching part of that, and then there's the research part. On the research side of things, a lot of what we're doing revolves around the idea of co-authorship, co-creation and participatory research, where we try to involve people, or we do actively involve people, who are maybe outside of the typical groups that we would test things on in the lab. That's partly because test participants tend to be whoever else is in the lab at the time, so they're people who maybe already work in robotics, or who already work in the university; they're only so many degrees of separation from who we are as the developers of these things.

To take an example of something we've been doing recently: it's not directly related to decolonisation, this particular example, but it could be extended in that direction. We are running co-creation workshops for the development of soft robots, with the aim of having these robots act as robot avatars for people who can't be in a physical space because they are socially isolating, or isolating for another reason. In that we're mostly looking at elderly people, but there's no reason that couldn't be another group. And actually there have been a few similar workshops working with other people, from the perspective of different projects in the robotics lab; the ideas for smart clothing were also developed in our group, and a lot of those ideas came from workshops with the potential end users of this type of thing. So there's an element of trying to bring people who are going to use this technology into an early stage of its development, in research.

And then in teaching, for example, there is the module that I teach on, mathematical data modelling. Before I talk about that: I think it can be really difficult to try and identify how to decolonise taught subjects in STEM particularly, partly because there is, necessarily, so much focus on highly technical subjects and topics, and the teaching is sometimes potentially agnostic to social influence or social issues. But in subjects like mathematical data modelling, which is a unit I teach on at Bristol, a constant focus of the unit is to bring in problems from outside of engineering maths, and, in the best case, from outside of the university completely. These are data-driven problems, either in physical modelling or data modelling, or other types of mathematical modelling, looking at translating a real-world issue into a model to try and answer some open questions about it. In that, we're trying to get students to... well, first of all we're making this an assessed part of the course, so we're making it compulsory for students to try to identify, and make recommendations to mitigate, sources of bias in the data that they're working with. That's not a trivial thing at all; it's a very significant thing that will have a measurable impact not only on the success of the student's project in that course, but also, if they go on to be a data engineer, for example, on the future products they develop and the future work they do in their careers.

So I think those are two ongoing things that are working quite successfully at the moment: bringing in people who are going to use this technology at an early stage, who are outside of my direct networks, and building into the curriculum non-trivial elements that make students think about why removing sources of bias is important, not only for the purpose of getting a good mark or being a good person, but for the success of what they're doing and how it works most effectively in the real world.

Yeah, I think it's a good question. It's something that was talked about in a paper that you shared with me not very long ago, and it's a really interesting, and quite difficult, question: if you give a robot an implied human race, do the potential social implications that apply to humans then apply to the robot? A lot of humanoid robots are made to look white, and that's problematic for a lot of different reasons. But there's also this thing in robotics, the uncanny valley, where people have a kind of rejection of things that look too similar to humans: as things become more and more familiar, people warm to them, and then there's a massive drop-off where things become so close to actually resembling humans that they become creepy and people don't actually want to interact with them. So I don't think there is necessarily a need for robots to resemble humans in a lot of different tasks, and there are a lot of tasks where a humanoid robot is not necessarily the best-suited morphology to achieve whatever the robot is trying to do. I don't think we necessarily need to build race into robots as a human characteristic, in the same way that a lot of other human characteristics are potentially becoming more and more unnecessary in robots as we focus on the things that make them best suited to a task. If we look at caring robots, they don't necessarily look exactly like humans; in fact, they look kind of far from it. And if we look at things like robots that are used to build a calming, friendly or companion-type relationship with a human, they're often animal robots, animatronic robots, that look more like a pet, for example, than a human being.

Mitigating inherent bias in robotics and AI, including bias that has its origins in colonialism, is an important part of teaching and research because of its tangible and measurable effect on the success of robots and autonomous systems deployed in the real world. The agency of the human developer, and the influence of any bias they may hold, intentional or unintentional, on the systems they work on, are often overlooked. Failing to identify and act on this can negatively affect the success of robots and algorithms when deployed in the real world, and can result in outcomes that discriminate against certain users.
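The kind of discriminatory performance gap described above is easiest to see when evaluation results are broken down by user group rather than averaged. The sketch below is a minimal, hypothetical illustration of such a per-group check; the group labels and outcomes are invented for the example and are not data from the interview or course.

```python
# A minimal, hypothetical sketch of checking whether a deployed system works
# equally well for different groups of users. All data here is invented.
from collections import defaultdict

# Hypothetical interaction log: (user_group, interaction_was_successful)
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
successes = defaultdict(int)
for group, ok in results:
    totals[group] += 1
    successes[group] += int(ok)

for group in sorted(totals):
    rate = successes[group] / totals[group]
    print(f"{group}: success rate {rate:.2f} over {totals[group]} interactions")

# A large gap between groups is exactly the kind of failure an overall average
# hides, and that only surfaces once the system meets users outside the
# developers' own circle.
```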

Ongoing strategies to counter this at the University of Bristol include initiatives in both research and teaching. In robotics research, for example, we are engaging communities that are often overlooked in engineering, through participatory research, to co-author robot behaviours early in the development process. In teaching, we are introducing components of assignments that require students to identify and suggest mitigations for sources of discriminatory bias in their mathematical modelling and algorithms. Perhaps most importantly, we recognise that we are not coming from the perspective of a system that has got it right. By stimulating students to consider sources of bias in their work, our goal is to nurture the development of the next generation of engineers as individuals capable of developing solutions to current problems of discrimination in robotics and AI.
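As one hedged illustration of the kind of assignment component described above (the dataset, column names and weighting scheme are assumptions made for this sketch, not the actual course material), a student might first check how well each group is represented in their data and then propose a simple mitigation such as reweighting samples:

```python
import pandas as pd

# Hypothetical training data for a data-modelling exercise; the column names
# and groups are invented for illustration only.
df = pd.DataFrame({
    "group":  ["a", "a", "a", "a", "b", "b", "c"],
    "target": [1, 0, 1, 1, 0, 1, 0],
})

# Step 1: identify a possible source of bias - unequal representation.
counts = df["group"].value_counts()
print("Share of samples per group:")
print(counts / len(df))

# Step 2: one possible mitigation - weight samples inversely to their group's
# frequency so each group contributes equally when a model is fitted.
weights = df["group"].map(len(df) / (len(counts) * counts))
print("Per-sample weights:")
print(weights)
```

Whether reweighting is appropriate, or whether the data should instead be collected differently, is itself part of the recommendation students are asked to make.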

This article is from the free online course Decolonising Education: From Theory to Practice, created by FutureLearn.
