Skip to 0 minutes and 8 seconds Hi. I’m Maitreyee Wairagkar and I’m in the first year of my PhD in cybernetics. I recently graduated with an MEng in artificial intelligence and cybernetics from the University of Reading. I was interested in developing technologies for human enhancement, especially for helping differently abled people, and my course in artificial intelligence and cybernetics really helped me do that. In the first year, we started learning about robotics using this simple robot, which has ultrasound sensors on it. It can detect and avoid objects, and thus navigate without hitting anything. In the second year, we progressed to more advanced robotics, and this is the second-year robot that we used for the robotics challenge.
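The obstacle-avoidance behaviour Maitreyee describes can be sketched in a few lines. This is a minimal illustration only, assuming a hypothetical robot API that returns ultrasound range readings in centimetres; the function and constant names are not the actual Reading robot's interface.

```python
# Hypothetical sketch of ultrasound-based obstacle avoidance.
# All names and the threshold are illustrative assumptions.

SAFE_DISTANCE_CM = 20.0  # assumed minimum clearance before steering away

def choose_action(left_cm: float, front_cm: float, right_cm: float) -> str:
    """Pick a steering action from three ultrasound range readings."""
    if front_cm > SAFE_DISTANCE_CM:
        return "forward"  # the path ahead is clear
    # Obstacle ahead: turn towards whichever side has more free space.
    return "turn_left" if left_cm > right_cm else "turn_right"

print(choose_action(50.0, 100.0, 30.0))  # forward
print(choose_action(15.0, 10.0, 40.0))   # turn_right
```

Calling this decision function in a loop, with fresh sensor readings each time, is enough to produce the simple "wander without hitting anything" behaviour described above.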
Skip to 0 minutes and 53 seconds It was a team-based activity where each team developed this robot with multiple sensors, and the goal was to complete different tasks on an emulated planet, like measuring seismic activity from an earthquake, taking the temperature of a volcano, or mapping the area. The major challenge was getting the data from the different sensors (an accelerometer and gyroscope for the orientation of the robot, a magnetometer, and temperature and light sensors) and integrating all that information to control the behaviour of the robot to complete the different tasks.
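One standard way to integrate accelerometer and gyroscope data into a single orientation estimate, as described above, is a complementary filter: the gyroscope integral is smooth but drifts over time, while the accelerometer's tilt reading is noisy but drift-free, so the two are blended. This is a generic textbook sketch, not the method used on the Reading robot; the blending weight is an assumed typical value.

```python
import math

def accel_pitch_deg(ax: float, ay: float, az: float) -> float:
    """Tilt angle (pitch) implied by gravity on the accelerometer axes."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(angle_deg: float, gyro_rate_dps: float,
                         accel_angle_deg: float, dt: float,
                         alpha: float = 0.98) -> float:
    """Fuse a gyroscope rate with an accelerometer angle into one estimate.

    alpha close to 1 trusts the smooth gyro integral in the short term,
    while the small accelerometer weight corrects long-term drift.
    """
    gyro_estimate = angle_deg + gyro_rate_dps * dt  # integrate the rate
    return alpha * gyro_estimate + (1.0 - alpha) * accel_angle_deg
```

In use, the filter is called once per sensor sample, feeding the previous output back in as `angle_deg`, so the estimate tracks fast rotations without accumulating gyro drift.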
Skip to 1 minute and 30 seconds Along with the hardware of the robot and the electronic components, we also learned about processing the information that we get from the robot, and how to develop intelligent and emergent behaviours in it. Modules like artificial intelligence, evolutionary computation, and swarm intelligence helped us to understand the mechanics behind the robots and how we can develop complicated behaviour in them in a non-traditional way. My PhD research is based on brain-computer interfaces. A brain-computer interface is a technology which enables human beings to control external objects, like computers or robots, directly with their brain waves.
Skip to 2 minutes and 8 seconds This technology is especially useful for people with severe motor disabilities, who can control external objects and interact with their environment using their brain waves directly. This is an EEG headset which measures brain waves and sends the signal to a computer. On the computer, we apply signal processing to those brain waves and try to understand the intention of the person. Once that is detected, we can send it as a command to control an external object. It can be a robot, a computer, or even a wheelchair. This is an example of a 3D-printed prosthetic arm. We can control it using a brain-computer interface.
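The pipeline described above (measure EEG, process the signal, decode intention, issue a command) can be illustrated with a toy decoder. A common approach in motor-imagery BCIs is to track power in the mu band (roughly 8-12 Hz) over the motor cortex, which drops when a person imagines moving. This sketch is a simplified illustration of that general idea, not Maitreyee's actual research method; the sampling rate and threshold are assumptions.

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz

def band_power(eeg: np.ndarray, low: float, high: float, fs: int = FS) -> float:
    """Average power of one EEG channel within a frequency band, via the FFT."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return float(spectrum[mask].mean())

def decode_intention(eeg: np.ndarray, threshold: float) -> str:
    """Toy decoder: motor imagery suppresses mu-band (8-12 Hz) power,
    so low mu power is mapped to a 'move' command, otherwise 'rest'."""
    mu = band_power(eeg, 8.0, 12.0)
    return "move" if mu < threshold else "rest"
```

A real system would add filtering, artifact rejection, and a trained classifier per user, but the structure (features extracted from brain waves, mapped to discrete commands for a robot, computer, or wheelchair) is the same.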
Skip to 2 minutes and 54 seconds When a person thinks of making different gestures, that brain signal is identified and processed by the brain-computer interface, and a command is sent to this prosthetic arm to move different digits and make different gestures. This kind of robotic prosthetic device could also be enhanced by adding sensors on the fingertips. We could have temperature sensors on the fingertips which detect the temperature, and if the object touched is very hot, the arm will retract automatically. We could also have gyroscopes and accelerometers on different joints of the arm to adjust its orientation naturally. Another application of brain-computer interfaces is in robotic rehabilitation. The traditional way of rehabilitation therapy for stroke patients is that the therapist moves their limb passively.
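The automatic-retract behaviour for fingertip temperature sensors amounts to a simple safety reflex that overrides the BCI command. This is a hypothetical sketch; the threshold and all names are assumptions, not the actual prosthetic's control software.

```python
# Illustrative safety reflex for a sensorised prosthetic hand.
# The threshold and function names are assumed for this sketch.

HOT_THRESHOLD_C = 55.0  # assumed temperature above which the hand retracts

def fingertip_reflex(fingertip_temps_c: list) -> str:
    """Retract automatically when any fingertip touches something too hot;
    otherwise leave control with the brain-computer interface."""
    if any(temp > HOT_THRESHOLD_C for temp in fingertip_temps_c):
        return "retract"
    return "follow_bci_command"

print(fingertip_reflex([36.5, 37.0, 80.2, 36.8, 37.1]))  # retract
print(fingertip_reflex([36.5, 37.0, 36.2, 36.8, 37.1]))  # follow_bci_command
```

Keeping such reflexes separate from, and with priority over, the decoded brain commands mirrors how biological withdrawal reflexes bypass conscious control.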
Skip to 3 minutes and 46 seconds By repeating the movement, they rebuild the pathways in their brain to regain that particular movement. But that way, the patient is not really engaged actively in the movement, and the movement is done passively for them by someone else. With robotic rehabilitation integrated with a brain-computer interface, the patient can think of performing the movement and the robot can assist them to perform it, so it becomes more active engagement by the patient, which helps in faster recovery. This is a soft robotic rehabilitation device, which works on inflation and deflation of a silicone sleeve.
Skip to 4 minutes and 24 seconds We can, again, integrate it with a brain-computer interface, and when the person wants to retract their arm, the air inside this soft robotic device is compressed, helping the patient to move the arm. When the patient wants to open their arm up, the device is inflated automatically, facilitating this opening and closing movement, which is helpful for rehabilitation. My undergraduate course was very research-oriented. I did many summer internships to gain knowledge beyond the curriculum. I also got the chance to present my work to students from different disciplines at the British Undergraduate Research Conference in 2013.
Meet the robots at Reading: a student's perspective
Meet Maitreyee Wairagkar, who graduated from the University of Reading with a degree in Cybernetics. Maitreyee continued to study at Reading, and was awarded an MEng in Artificial Intelligence and Cybernetics. She is now studying for her PhD in cybernetics.
In this video Maitreyee introduces several of the Reading robots that she’s worked with during her undergraduate and postgraduate career. Her current research focuses on developing technologies for human enhancement to help differently abled people.
She demonstrates a 3D-printed prosthetic arm which can be controlled using a brain-computer interface. It can be enhanced with temperature sensors on the fingertips, and gyroscopes and accelerometers on different joints, to give the limb more natural movement.
She also demonstrates a soft robotic rehabilitation device that can be combined with a brain-computer interface to help patients actively engage in moving their arm during rehabilitation after a stroke.
© University of Reading