
Robot–human interactions

Robots interact with the world in which they exist and with others, but the amount of interaction varies. Professor Richard Mitchell explains more.
© University of Reading
An important aspect of our approach to robots is their interaction with the world in which they exist and with others – particularly humans.
Currently, in practice, all real robots interact with humans to some extent. However, the amount of interaction varies, and it reflects how far a robot works for itself or relies on us. To what extent is the robot autonomous? How much does it do for itself?
Consider this figure:
[Figure: a right-pointing arrow labelled ‘Increasing human input’; beneath it, from left to right: autonomous, commanded, remote-controlled]
Figure 1: Stages of human input. © University of Reading
At the left-hand side of the figure, the stages for a robot to work are:
  1. Robot turned on
  2. Robot is told what to do
  3. Robot carries out that action
  4. Robot reports back (and is fixed if something goes wrong)
This is an ‘autonomous’ robot – one that can make many decisions for itself. The Mars Rover, for instance, is commanded from Earth to go to a certain point and take a rock sample – but the robot handles the local navigation to get to the right place by itself and completes the movements needed to take the sample.
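The four stages above can be sketched as a simple program. This is a minimal illustration, not code from the course: the `Robot` class and its methods are hypothetical stand-ins for real hardware.

```python
# Sketch of the autonomous stages: turn on, command once,
# let the robot act for itself, then read its report.

class Robot:
    def __init__(self):
        self.on = False
        self.log = []

    def turn_on(self):               # Stage 1: robot turned on
        self.on = True

    def command(self, action):       # Stage 2: robot is told what to do
        # Stage 3: the robot plans and performs the action by itself;
        # here we simply record that it completed the action.
        self.log.append(f"done: {action}")

    def report(self):                # Stage 4: robot reports back
        return self.log

rover = Robot()
rover.turn_on()
rover.command("take rock sample")
print(rover.report())                # prints ['done: take rock sample']
```

The key point is that the human appears only at stages 1, 2 and 4; stage 3 happens without human input.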
Near the other end of the spectrum we have robots that rely on humans directing their activities most of the time – typically with a remote controller. Here the stages are:
  • Robot turned on
  • Continually
    • Human tells robot what to do
    • Robot does that action
  • Until task is done or the human is bored!
Examples include robot toys steered by humans using a remote-control joystick.
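The remote-controlled loop above can also be sketched in a few lines. This is an illustrative model only: the command names and the `"stop"` sentinel (standing in for “task done or human bored”) are assumptions, not from the course.

```python
# Sketch of the remote-controlled loop: the human issues commands
# continually, and the robot does each one, until the task ends.

def remote_control(commands):
    """Drive the robot one human command at a time."""
    performed = []
    for cmd in commands:        # continually: human tells robot what to do
        if cmd == "stop":       # until task is done or the human is bored!
            break
        performed.append(cmd)   # robot does that action
    return performed

moves = remote_control(["forward", "left", "forward", "stop", "right"])
print(moves)                    # prints ['forward', 'left', 'forward']
```

Unlike the autonomous case, the human sits inside the loop: every action the robot takes is one the human just asked for.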
At the right-hand end of the figure, there is intimate interaction with the robot – such as the use of haptic devices – where the human holds the robot, whose subtle movements give the impression of feeling something that is not there. These are used in Virtual Reality. Here the stages are:
  1. User in contact with end of robot arm
  2. Arm moves slightly to give impression you are touching something
  3. Arm can also push, so you can feel force – so-called force feedback
We discuss haptics and the Virtual Drum Kit in Week 3.
The level of autonomy is not a measure of sophistication: a simple remote-controlled mechanical toy and the sophisticated NASA space arm both require direct input from a human, while a complex industrial robot and a simple toy can both simply be turned on, told what to do, and left to get on with it.
This article is from the free online

Begin Robotics
