This article is from the University of Reading's online course, Begin Robotics.

Robot–human interactions

An important aspect of our approach to robots is their interaction with the world in which they exist and with others - particularly humans.

Currently, in practice, all real robots interact with humans to some extent. However, the amount of interaction varies, and this reflects the extent to which a robot works by itself or relies on us. To what extent is the robot autonomous? How much does it do for itself?

Consider Figure 1: an arrow labelled ‘Increasing human input’ points to the right; beneath it, from left to right, are the labels ‘autonomous’, ‘commanded’ and ‘remote-controlled’.

Figure 1: Stages of human input. © University of Reading

At the left-hand end of the figure, the stages for a robot to work are:

  1. Robot turned on
  2. Robot is told what to do
  3. Robot does that certain action
  4. Robot reports back (then is fixed if it goes wrong)

This is an ‘autonomous’ robot - one that can make many decisions for itself. The Mars rover, for instance, is commanded from Earth to go to a certain point and take a rock sample - but the robot handles the local navigation to reach the right place by itself and completes the movements needed to take the sample.
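The four stages above can be sketched as a small program. This is a minimal illustration, not course material: the class name, task string and log structure are all assumptions made for the example.

```python
# Sketch of the four stages of an autonomous robot:
# turn on -> tell it what to do -> it acts by itself -> it reports back.

class AutonomousRobot:
    def __init__(self):
        self.on = False
        self.task = None
        self.log = []

    def turn_on(self):               # stage 1: robot turned on
        self.on = True

    def command(self, task):         # stage 2: robot is told what to do
        self.task = task

    def run(self):                   # stage 3: robot does that action itself
        # A real robot would sense, plan and move here; we just record
        # that the task was handled without further human input.
        self.log.append(f"completed {self.task}")

    def report(self):                # stage 4: robot reports back
        return self.log[-1]

rover = AutonomousRobot()
rover.turn_on()
rover.command("take rock sample")
rover.run()
print(rover.report())                # prints "completed take rock sample"
```

The point of the sketch is that the human appears only at stages 1 and 2; stages 3 and 4 happen without further intervention.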

Near the other end of the spectrum we have robots that rely on humans directing their activities most of the time - typically with a remote controller. Here the stages are:

  • Robot turned on
  • Continually
    • Human tells robot what to do
    • Robot does that action
  • Until task is done or the human is bored!

Examples include robot toys steered by a human using a remote-control joystick.
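The remote-control stages above amount to a loop. In this sketch a list of strings stands in for the joystick input - an assumption made purely for illustration.

```python
# Sketch of the remote-control loop: the human continually tells the
# robot what to do, until the task is done (or the human is bored!).

commands = ["forward", "left", "forward", "stop"]   # stand-in for joystick input
actions_done = []

powered_on = True                  # robot turned on
for cmd in commands:               # continually: human tells robot what to do
    if cmd == "stop":              # until task is done or the human is bored
        break
    actions_done.append(cmd)       # robot does that action

print(actions_done)                # prints ['forward', 'left', 'forward']
```

Unlike the autonomous case, the human stays inside the loop: every single action comes from a human decision.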

At the right-hand end of the figure there is intimate interaction with the robot - such as the use of haptic devices, where the human holds the robot and its subtle movements give the impression of feeling something that is not there. These are used in Virtual Reality. Here the stages are:

  1. User in contact with end of robot arm
  2. Arm moves slightly to give impression you are touching something
  3. Arm can also push, so you can feel force - so-called force feedback

We discuss haptics and the Virtual Drum Kit in Week 3.
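A common way to produce the force-feedback effect described above is a simple spring model: when the user's hand crosses a virtual surface, the arm pushes back with a force proportional to how far the hand has penetrated. The function below is a sketch of that idea; the stiffness value and positions are illustrative assumptions, not figures from the course.

```python
# Sketch of spring-model force feedback for a haptic arm.
# The virtual surface sits at position 0; positions below it count as
# "inside" the virtual object, and the arm pushes back proportionally.

def feedback_force(hand_pos, surface_pos=0.0, stiffness=200.0):
    """Return the force (in newtons) the haptic arm applies to the hand."""
    penetration = surface_pos - hand_pos   # how far inside the virtual object
    if penetration <= 0:
        return 0.0                         # not touching: arm applies no force
    return stiffness * penetration         # push back in proportion to depth

print(feedback_force(0.1))    # hand above the surface: prints 0.0
print(feedback_force(-0.01))  # hand 1 cm inside: prints 2.0
```

Because the force ramps up smoothly with penetration depth, the user feels a surface that "is not there" rather than a sudden jolt.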

The level of autonomy is not a measure of sophistication. A simple remote-controlled mechanical toy and the NASA space arm both require direct input from a human, while a complex industrial robot and some simple toys alike can simply be turned on, told what to do, and left to get on with it.
