Brain-controlled robots

Over the last three decades, the Brain-Computer Interface (BCI) has attracted considerable attention from robotics groups, neuroscientists, computer scientists and neurologists, driven by new scientific progress in understanding brain function and by impressive applications.

The definition of a BCI given by Professor Wolpaw in ‘Brain-computer interfaces: principles and practice’ (2012) is:

“A system that measures central nervous system (CNS) activity and converts it into artificial output that replaces, restores, enhances, supplements, or improves natural CNS output and thereby changes the ongoing interactions between the CNS and its external or internal environment.”
Quoted with kind permission from Oxford University Press.

Based on this definition, a BCI system can control a robot or other assistive devices using our thoughts. Such a system can greatly help people with generalised paralysis to gain some level of independence. The BCI inputs are brain signals carrying informative neural features. The BCI outputs are used to control a device, such as an assistive robot, a wheelchair or a prosthetic hand.

As an example, in a BCI system the user may be asked to imagine movement of their right hand in order to turn a wheelchair towards their right side, or movement of their left hand to turn the chair to their left. Each BCI uses specific algorithms to translate its input into command signals that control the output device.

The whole architecture of an online BCI system is summarised in the diagram below:

A diagram that shows the flow between the six stages of using a BCI system - measurement of brain activity, preprocessing, feature extraction, classification, feedback, control a device - feeding back again into measurement of brain activity

The core components of a BCI system are as follows:

1. Measurement of brain activity

This part is responsible for recording brain activity using various types of sensors. After amplification and digitisation, the recorded brain signals serve as the BCI inputs.
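
To make this concrete, the sketch below simulates a short segment of digitised EEG in Python with NumPy. The sampling rate, channel count and signal content are illustrative assumptions, not properties of any particular recording system.

```python
import numpy as np

fs = 250                        # assumed sampling rate in Hz
n_channels = 8                  # assumed number of electrodes
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic stand-in for one second of amplified, digitised EEG:
# a 10 Hz oscillation buried in noise, one row per electrode.
rng = np.random.default_rng(0)
eeg = 10 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal((n_channels, t.size))

print(eeg.shape)                # (8, 250): channels x samples, ready for preprocessing
```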

2. Preprocessing

This unit reduces the noise and artefacts present in the brain signals in order to enhance the relevant information hidden within them.
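
As a minimal sketch of this step, the snippet below applies a band-pass filter with SciPy to keep the 8-30 Hz range often used for motor imagery. The band limits and filter order are illustrative choices, not values prescribed by the course.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(signal, fs, low=8.0, high=30.0, order=4):
    """Zero-phase band-pass filter applied along the samples axis."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal, axis=-1)

fs = 250
raw = np.random.randn(8, 2 * fs)     # synthetic 2-second, 8-channel segment
clean = bandpass(raw, fs)            # frequencies outside 8-30 Hz are attenuated
```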

3. Feature extraction

The feature extractor transforms the preprocessed signals into feature values that reflect the underlying neurological mechanism. These features are used by the BCI to control the output device.
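
One common choice of feature for motor-imagery BCIs is the power of each channel in the mu and beta frequency bands. The sketch below estimates these with Welch's method; the band limits and segment length are assumptions made for illustration.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power of each channel within a frequency band (Welch estimate)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    low, high = band
    mask = (freqs >= low) & (freqs <= high)
    return psd[..., mask].mean(axis=-1)

fs = 250
segment = np.random.randn(8, 2 * fs)               # preprocessed 2-second segment
features = np.concatenate([
    band_power(segment, fs, (8, 12)),               # mu band
    band_power(segment, fs, (18, 26)),              # beta band
])
print(features.shape)                               # (16,): two band powers per channel
```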

4. Classification

This part is responsible for identifying the intention of the user from the extracted features.
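
Many BCIs use simple linear classifiers for this step. The sketch below trains a linear discriminant analysis (LDA) model with scikit-learn on synthetic feature vectors; the data and class labels are placeholders for real, labelled training trials.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 16))        # feature vectors from 100 training trials
y_train = rng.integers(0, 2, size=100)      # 0 = left-hand imagery, 1 = right-hand imagery

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

x_new = rng.normal(size=(1, 16))            # features from a newly recorded segment
intention = clf.predict(x_new)[0]           # the user's inferred intention
print(intention)
```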

5. Control a device

The output device can be a computer, a wheelchair, a robotic arm and so on. The output of the classifier is used as a command to control it.
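
A minimal sketch of this mapping is shown below. The command names and the send_command function are hypothetical placeholders, not the API of any real wheelchair or robot.

```python
# Hypothetical mapping from classifier output to device commands.
COMMANDS = {0: "turn_left", 1: "turn_right", 2: "move_forward"}

def send_command(label):
    command = COMMANDS.get(label, "stop")       # fall back to a safe command
    print(f"Sending '{command}' to the wheelchair controller")

send_command(1)                                 # e.g. right-hand imagery was detected
```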

6. Feedback

Ideally, a BCI should be a closed-loop system in which the output (the identified mental state) is shown to the user after the brain signals have been processed. Feedback can be visual, auditory or tactile, and it helps the user regulate their brain activity and adapt accordingly to enhance the overall performance of the BCI.
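
The closed loop can be sketched as below, where the predicted mental state is shown back to the user after every trial. Here classify_current_segment is a placeholder standing in for the whole acquisition-to-classification pipeline.

```python
import random
import time

def classify_current_segment():
    """Placeholder for the full pipeline: acquisition, preprocessing, features, classifier."""
    return random.choice(["left", "right"])

for trial in range(3):
    prediction = classify_current_segment()
    # Visual feedback: the user sees the detected state and can adapt their imagery.
    print(f"Trial {trial + 1}: detected '{prediction}' hand imagery")
    time.sleep(1)
```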

All of these units are important in the development of an efficient BCI and affect its performance in terms of accuracy, speed and information transfer rate. A BCI must be designed to carry out this process comfortably and without any harm to the user’s health.
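
Information transfer rate is often quantified with the formula introduced by Wolpaw and colleagues, which combines the number of classes, the classification accuracy and the trial duration. The figures below are illustrative rather than taken from any particular system.

```python
import math

def information_transfer_rate(n_classes, accuracy, trial_seconds):
    """Bits per minute, following the Wolpaw information transfer rate formula."""
    n, p = n_classes, accuracy
    bits_per_trial = (math.log2(n)
                      + p * math.log2(p)
                      + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits_per_trial * 60.0 / trial_seconds

# e.g. a two-class BCI that is 90% accurate with 4-second trials
print(round(information_transfer_rate(2, 0.90, 4.0), 2))    # roughly 8 bits per minute
```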


References

Pfurtscheller, G., Neuper, C., Guger, C., Harkam, W., Ramoser, H., Schlogl, A., Obermaier, B. and Pregenzer, M. (2000). Current trends in Graz brain-computer interface (BCI) research. IEEE Transactions on Rehabilitation Engineering, 8(2), pp.216-219.

Vanacker, G., Millán, J., Lew, E., Ferrez, P., Moles, F., Philips, J., Van Brussel, H. and Nuttin, M. (2007). Context-Based Filtering for Assisted Brain-Actuated Wheelchair Driving. Computational Intelligence and Neuroscience, pp.1-12.

Wolpaw, J.R. and Wolpaw, E.W. (eds.) (2012). Brain-computer interfaces: principles and practice. Oxford: Oxford University Press, pp.3-12.
