Meet the robots at Reading: Haptic Master

Demo of the Haptic Master in action: a multi-sensory robotic arm that allows users to explore different textures in 3D virtual reality environments.
8.1
OK. I’m Andrew Glennerster. I’m head of the VR lab here. And the purpose of the lab is to make stimuli more and more realistic, more and more like the real world. So we’ve done this for vision, so that you’re allowed to move around and move your head, move your eyes freely, as someone would normally. And now we’re manoeuvring into haptics, where you can not only see the world as you move around, but you can reach out and touch it, and it feels like it really should in the real world. I’m Peter Scarfe. I’m a lecturer here in psychology, and my background has been in human visual perception.
39.3
So I was initially studying 3D vision, and then after I completed my PhD, I worked in Germany. And that’s when I first got introduced to equipment such as this, so the robotics and the haptics. Then, having moved on to Reading, I was able to combine both of those research interests, so the stuff I’d already done and the stuff I wanted to do with haptics. And we’re doing that in a collaboration between the virtual reality lab here in psychology and the haptic robotics lab over in engineering. So what we have here is essentially a demo of the technology which we’re using. So we have a virtual maze which we can actually rotate and move.
90.3
But the nice thing here is with our robot arm, we can also navigate through the maze and actually get haptic feedback. So if we just went into the door of the maze, and if I try and push through the surface, I’m not going to be able to get through, because we’re simulating the maze both visually and by touch. Also, we have an essentially glass surface on the top of the maze as well. So again, you can’t get through that. And the robot is much stronger than you, so no matter how hard you push, you won’t be able to get through.
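The "can't push through it" effect Peter describes is typically achieved with penalty-based haptic rendering: once the probe penetrates a virtual surface, the device pushes back with a force proportional to the penetration depth. The sketch below illustrates that general technique for a flat wall, with made-up stiffness and damping values; it is an illustration of the standard approach, not the lab's actual software.

```python
import numpy as np

# Minimal sketch of penalty-based haptic rendering for a rigid virtual wall.
# The wall is the plane z = 0; the probe may penetrate it slightly, and the
# device pushes back proportionally. Constants are illustrative only.

WALL_STIFFNESS = 2000.0   # N/m, how "hard" the surface feels
WALL_DAMPING = 5.0        # N*s/m, suppresses buzzing on contact

def wall_force(probe_pos, probe_vel):
    """Return the 3D force (N) the device should apply for a wall at z = 0."""
    penetration = -probe_pos[2]            # how far below the surface we are
    if penetration <= 0.0:
        return np.zeros(3)                 # no contact, no force
    normal = np.array([0.0, 0.0, 1.0])     # wall pushes straight back out
    # Spring term pushes the probe out; damping only resists motion into the
    # wall, so the surface does not feel sticky when you pull away.
    force = WALL_STIFFNESS * penetration * normal
    approach_speed = min(probe_vel[2], 0.0)
    force += -WALL_DAMPING * approach_speed * normal
    return force

# Example: probe 2 mm inside the wall, moving into it at 1 cm/s
print(wall_force(np.array([0.0, 0.0, -0.002]), np.array([0.0, 0.0, -0.01])))
```

Because the device can generate far larger forces than a person can comfortably apply, the rendered wall feels effectively solid, which is why no amount of pushing gets you through.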
123.5
And as you navigate through the maze, we have various different objects which you can actually feel the surface properties of as well. So I’m just navigating through some hoops now, and you can feel the smooth, shiny surface those hoops have. Similarly, we can have things like– kind of like molasses or pushing through jelly. So now I’m going down past the corridor, which is much harder to move through, but I can still push through it. And then I can go up here, and I’m going over a bumpy surface. So you can get very fine feedback about what you’re feeling. And around the corner again. And now I’ve got– like a magnet. So if I just tap.
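The different surface feels Peter demonstrates (jelly-like resistance, bumps, a magnetic pull) correspond to simple, standard force models: viscous damping, a spatial texture term, and a spring toward an attraction point. Here is a minimal sketch of those models with invented constants, as an illustration of the general idea rather than the Haptic Master's actual implementation.

```python
import numpy as np

# Illustrative force models for the surface effects described above.
# All constants are hand-tuned, made-up values, not the lab's settings.

def viscous_force(vel, b=20.0):
    """'Molasses/jelly' feel: force opposes velocity (F = -b * v)."""
    return -b * vel

def bumpy_surface_force(pos, amplitude=1.5, spatial_freq=200.0):
    """Bumps: a sinusoidal vertical force as the probe slides along x."""
    return np.array([0.0, 0.0, amplitude * np.sin(spatial_freq * pos[0])])

def magnet_force(pos, target, k=300.0, radius=0.05):
    """'Magnetic' snap: spring pull toward a target point within a radius."""
    offset = target - pos
    if np.linalg.norm(offset) > radius:
        return np.zeros(3)
    return k * offset

# In a haptic loop (typically running at around 1 kHz) these contributions
# are simply summed and sent to the device on every tick.
pos = np.array([0.01, 0.0, 0.0])
vel = np.array([0.05, 0.0, 0.0])
total = viscous_force(vel) + bumpy_surface_force(pos) + magnet_force(pos, np.zeros(3))
print(total)
```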
170.9
See? It’s kind of like a magnetic force field, basically. So we can combine all of these different types of sensory feedback to look at how we perceive the world. The main thing we’re using haptics for is research on multisensory integration. So essentially, how you combine information from vision and touch. The really nice thing about this equipment is you are not limited by the physical world. Say, in the physical world, when you reach out to touch something, like the surface of a table, your vision and your touch are always telling you the same thing. But what we can do with the virtual reality and the haptics is we can actually put those two types of information into conflict.
221.2
So you could be reaching out to touch a surface which visually looks flat, but actually feels as if it’s slanted. And the reason we do that is it allows us to pull apart the contributions of different sources of sensory information to estimates of, say, shape or distance or depth. And on a bigger picture, how you combine those sorts of information when you’re, say, navigating around the world. The really nice thing is that engineers and psychologists have a completely different skill set, but we’re actually studying the same thing. We’re both interested in sensory systems and how they combine information, whether that’s human sensory systems or robotic sensory systems.
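The cue-conflict experiments Peter describes are commonly analysed with the standard maximum-likelihood model of cue combination, in which each cue is weighted by its reliability (the inverse of its variance). The sketch below works through a hypothetical visual-haptic slant example with invented numbers; it illustrates the general model rather than any specific result from the Reading lab.

```python
import numpy as np

# Standard reliability-weighted (maximum-likelihood) cue combination:
# each cue's weight is proportional to 1 / variance.
# The numbers below are invented for illustration, not measured data.

def combine_cues(estimates, sigmas):
    """Combine independent Gaussian cue estimates of the same property."""
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = reliabilities / reliabilities.sum()
    combined = np.dot(weights, estimates)
    combined_sigma = np.sqrt(1.0 / reliabilities.sum())
    return combined, weights, combined_sigma

# Example: vision says the surface is flat (0 deg) but is noisy;
# touch says 10 deg and is more reliable in this hypothetical case.
slant, w, sigma = combine_cues(estimates=[0.0, 10.0], sigmas=[4.0, 2.0])
print(f"combined slant = {slant:.1f} deg, weights = {w}, sigma = {sigma:.2f}")
```

In this toy example the combined estimate sits closer to the more reliable touch cue, and its predicted uncertainty is smaller than that of either cue alone, which is the key prediction the cue-conflict method is used to test.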
270.9
So the collaboration’s great from our perspective, because we have these two different skill sets which provide very different things. So the engineers can build us all this custom equipment, and then we can actually use this custom equipment to kind of probe human brain function, essentially.

In this video, Andrew Glennerster and Peter Scarfe explain how the Haptic Master, a multi-sensory robotic arm, enables the user to explore the texture of different surfaces whilst navigating through the 3D computer simulation of a ‘haptic’ maze.

Peter explains why he became interested in haptics and how the Haptic Master, based in the Virtual Reality Lab in the School of Psychology at Reading, is used in collaborative research with William Harwin’s Haptics Robotics Lab. He notes that although engineers and psychologists have different skill sets, they are interested in similar questions.

The collaboration between the two groups is opening up new and exciting possibilities, with the engineers designing, modifying and building haptic robotic devices that enable the psychologists to probe further into the function of the human brain.

This article is from the free online course Begin Robotics.
