Find out more about Virtual Reality, Interaction and Haptics in this article by Professor Richard Mitchell.
Any interaction between a human and a machine involves feedback – even placing the cursor in the right spot on the screen by moving a mouse is a feedback process. Perhaps the most interesting human-computer interaction challenge is virtual reality.
Here the computer generates an artificial world which it communicates to the human. Most of us are familiar with what a virtual reality environment looks like, and we can view it using 3D goggles or a CAVE – a room where the world is projected onto the walls. Adding realistic sounds is easy, and even the smell of the virtual world can be simulated. But the most interesting challenge is enabling the human to touch and feel items in the virtual world.
If the human moves their head, for instance, then they are looking somewhere else in the world, so the computer needs to detect the movement and send them a revised view of the world. The sounds may change too. This is very much a feedback process.
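In code, this loop might be sketched very simply: a sensor reports the head's orientation, and for each new reading the computer recomputes what the user should see. This is only an illustrative sketch (the yaw angles are made-up numbers, and a real headset tracks full 3D orientation and position, not a single angle):

```python
import math

def view_direction(yaw_deg):
    """Unit vector the user is looking along, for a given head yaw in degrees."""
    yaw = math.radians(yaw_deg)
    return (math.cos(yaw), math.sin(yaw))

def render_loop(yaw_readings):
    """The feedback loop: each new head reading produces a revised view."""
    views = []
    for yaw in yaw_readings:              # sensor measures the head movement
        views.append(view_direction(yaw))  # computer sends back a revised view
    return views

# The user turns their head from straight ahead (0 degrees) round to 90 degrees:
frames = render_loop([0, 45, 90])
```

The key point is the closed loop: the human's movement is the input, and the updated view is the output that the human then reacts to.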
Related to this is augmented reality, where the computer conveys images of the ‘real world’, onto which extra information is added. Head-up displays for pilots in cockpits are an example. Next week we will see images of a robot moving around onto which text is projected showing its ‘emotions’.
These techniques can also be used for remote control, or tele-operation. Robots are sometimes sent into an environment unsafe for humans to do a task. Rather than preprogramming the robot for the task, a human can be provided with an image of what the robot ‘sees’ and use that image to command the robot to operate in a particular way.
It can also be used in robot-assisted surgery.
In some of these latter applications, it is important for the human not only to see what is happening but also to feel it. One of the specialities at Reading is haptics – which is about touch.
Here you might want to be able to feel different surfaces – are they smooth, or rough?
If you are trying to get a robot to move something, you may need to know how much force to apply, so you need to get force-feedback.
When we pick up an egg, for instance, we need to grip it sufficiently firmly that it does not fall from our fingers, but if we press too hard, it breaks. Our fingertips have sensitive sensors and, without realising it, we just know how to grip firmly enough that the egg does not quite slip. This is a feedback process.
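The egg-gripping loop can be sketched as a simple controller: measure the slip, and if the egg is slipping, squeeze a little harder – but no harder than that. The numbers below are invented for illustration (a made-up egg that stops slipping at about 1 newton of grip, and would break well above that):

```python
def grip_controller(slip_sensor, steps=20, gain=0.5):
    """Feedback loop: increase the grip force only while slip is detected."""
    force = 0.0
    for _ in range(steps):
        slip = slip_sensor(force)   # fingertip sensors measure how much the egg slips
        if slip <= 0:               # no slip: the grip is already firm enough
            break
        force += gain * slip        # slipping: press a little harder
    return force

# A hypothetical egg that slips less and less as grip force approaches 1.0 N:
egg = lambda f: max(0.0, 1.0 - f)
force = grip_controller(egg)
```

The controller settles just below the force needed to stop the slip entirely – gripping "sufficiently firmly", never crushing.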
Haptic devices generate information about what a (potentially imaginary) object feels like, or how the object reacts when you push at it. Often these devices look a bit like a robot arm – you hold the device and it moves subtly, giving your fingertips the sensation of feeling a surface, for example. Or the device pushes back at you, so you can feel a force.
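A classic way such a device "pushes back" is the virtual wall: while your hand is in free space it feels nothing, but as soon as the device tip crosses an imaginary surface, the motors apply a force proportional to how far you have pushed in, like a stiff spring. A minimal sketch, assuming a wall at position 0 and an arbitrary stiffness of 300 N/m:

```python
def wall_force(position, wall=0.0, stiffness=300.0):
    """Spring-model force for a virtual wall.

    position:  device tip position in metres (negative = inside the wall)
    Returns the force in newtons pushing the hand back out.
    """
    penetration = wall - position   # positive once the tip is inside the wall
    if penetration <= 0:
        return 0.0                  # free space: no force, nothing to feel
    return stiffness * penetration  # inside: push back proportionally

# 1 cm into the wall feels like a 3 N push back; outside, nothing:
inside = wall_force(-0.01)
outside = wall_force(0.01)
```

Run at a high enough update rate, this simple rule is what makes an imaginary surface feel convincingly solid under your fingertips.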