
Brain-controlled robots

Mahnaz explains the principles behind mind control of robots, including brain patterns and mental activities.

An exciting, emerging area where biology meets robotics is brain-computer interfaces (BCIs). BCIs have the potential to enable direct mind control of robots and other machines in the future. This step explores the definition of a BCI and explains its key components.

Brain-computer interfaces

In the past few years, BCIs have attracted considerable attention from robotics groups, neuroscientists, computer scientists and neurologists, driven by new scientific progress in understanding brain function and by impressive applications.

BCI Definition: A BCI is a device that measures electrical activity in the brain or spinal cord, and transforms it into a synthetic output that can replace, restore or enhance natural function.
Based on this definition, a BCI system can control a robot or other assistive device using our thoughts. Such a system can greatly help people with generalised paralysis to gain some level of independence.

Brain-computer interfaces for robot control

The BCI inputs are brain signals that carry informative neural features, recorded by electrodes placed either inside the brain or on the scalp. The BCI outputs are used to control a device, such as an assistive robot, a wheelchair or a prosthetic hand.
Example: A person paralysed from the neck down would normally find independent mobility impossible. With a BCI system and a robotic wheelchair, they can use mental imagery, imagining movement of their right hand to turn the wheelchair to the right, or of their left hand to turn it to the left.

The BCI uses algorithms to translate the measured brain-wave activity into command signals for the output device: in this case, the motors driving the wheelchair to the left or right.
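
To make this translation step concrete, here is a minimal sketch in Python of how a decoded intention might be mapped to motor commands for a differential-drive wheelchair. The class labels and speed values are illustrative assumptions, not part of any particular BCI system.

# Minimal sketch: map a decoded mental-imagery class to wheelchair motor commands.
# The class labels and speed values are illustrative assumptions only.

def intent_to_wheel_speeds(intent: str, forward_speed: float = 0.3) -> tuple[float, float]:
    """Return (left_wheel, right_wheel) speeds in m/s for a decoded intention."""
    if intent == "imagine_left_hand":   # user imagines moving their left hand
        return (0.1, forward_speed)     # right wheel faster -> chair turns left
    if intent == "imagine_right_hand":  # user imagines moving their right hand
        return (forward_speed, 0.1)     # left wheel faster -> chair turns right
    return (0.0, 0.0)                   # unrecognised intention -> stop safely

# Example: the BCI has just decoded a 'right hand' imagery pattern.
left, right = intent_to_wheel_speeds("imagine_right_hand")
print(f"left wheel: {left} m/s, right wheel: {right} m/s")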

One of the key challenges for BCIs is decoding brain-wave activity into desired actions. This problem is usually addressed with machine learning algorithms for ‘classification’ (also known as ‘pattern recognition’), where a particular set of brain-wave patterns is ‘classed’ as a specific action, e.g. move left or move right.
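
As an illustration of what such a classifier might look like in practice, the sketch below trains a standard linear discriminant classifier (from scikit-learn) on synthetic band-power features labelled ‘left’ and ‘right’. The feature values are made up and the choice of classifier is an assumption for illustration; real BCIs use a range of classification methods trained on recorded brain signals.

# Illustrative sketch: classifying brain-wave feature vectors into 'left'/'right' commands.
# The feature data is synthetic; a real BCI would extract features from recorded EEG.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic band-power features (e.g. mu-band power over right and left motor cortex).
left_trials = rng.normal(loc=[1.0, 2.0], scale=0.3, size=(50, 2))   # 'move left' imagery
right_trials = rng.normal(loc=[2.0, 1.0], scale=0.3, size=(50, 2))  # 'move right' imagery

X = np.vstack([left_trials, right_trials])
y = ["left"] * 50 + ["right"] * 50

# Calibration: fit the classifier on labelled example trials.
clf = LinearDiscriminantAnalysis().fit(X, y)

# Online use: classify a new, unseen feature vector into a command.
new_trial = np.array([[1.1, 1.9]])
print(clf.predict(new_trial))  # expected output: ['left']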

Key components of a BCI

The overall architecture of an online BCI system is summarised in the diagram below:

A diagram showing the flow between the six stages of a BCI system: measurement of brain activity, preprocessing, feature extraction, classification, control of a device, and feedback, which feeds back into the measurement of brain activity

The core components of a BCI system are as follows:

1. Measurement of brain activity
This part is responsible for recording brain activity using various types of sensors. After amplification and digitisation, the recorded brain signals serve as the BCI inputs.

2. Preprocessing
This unit reduces the noise and artefacts present in the brain signals in order to enhance the relevant information hidden within them.

3. Feature extraction
The feature extractor transforms the preprocessed signals into feature values that correspond to the underlying neurological mechanism. These features are used by the BCI to control the output device.

4. Classification
This part is responsible for identifying the intention of the user from the extracted features.

5. Control a device
The output device can be a computer, a wheelchair, a robotic arm, etc. The output of the classifier is used as a command to control it.

6. Feedback
The BCI should feed back the consequences of the action to the user, in a closed loop, so that the user can make adjustments. Feedback can be visual, auditory or tactile.

All these units are important in the development of an efficient BCI and affect its performance in terms of accuracy, speed and information transfer rate. A BCI must be designed to carry out this process comfortably and without any harm to the user’s health.
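
To show how these six components fit together, the sketch below strings them into a single closed loop on synthetic data: a simulated two-channel EEG segment is band-pass filtered (preprocessing), reduced to band-power features (feature extraction), classified into a command, passed to a stand-in output device, and the result reported back to the user. The sampling rate, filter band, feature choice and toy classification rule are all assumptions made purely for illustration.

# Hedged end-to-end sketch of the six BCI stages on synthetic data.
# The filter band, features and the 'device' are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # sampling rate in Hz (a typical EEG rate, assumed here)

def measure_brain_activity(n_samples: int = FS) -> np.ndarray:
    """Stage 1: stand-in for an amplified, digitised two-channel EEG recording."""
    return np.random.default_rng().normal(size=(2, n_samples))

def preprocess(eeg: np.ndarray) -> np.ndarray:
    """Stage 2: band-pass filter 8-30 Hz to reduce noise and artefacts."""
    b, a = butter(4, [8, 30], btype="bandpass", fs=FS)
    return filtfilt(b, a, eeg, axis=1)

def extract_features(eeg: np.ndarray) -> np.ndarray:
    """Stage 3: log band power per channel as a simple feature vector."""
    return np.log(np.mean(eeg ** 2, axis=1))

def classify(features: np.ndarray) -> str:
    """Stage 4: toy decision rule standing in for a trained classifier."""
    return "turn_left" if features[0] > features[1] else "turn_right"

def control_device(command: str) -> str:
    """Stage 5: forward the command to the output device (here it is only reported)."""
    return f"wheelchair executing: {command}"

def give_feedback(result: str) -> None:
    """Stage 6: present the outcome to the user (visual feedback via print)."""
    print(result)

# One pass around the closed loop: measure -> preprocess -> extract -> classify -> control -> feedback.
give_feedback(control_device(classify(extract_features(preprocess(measure_brain_activity())))))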

© The University of Sheffield
This article is from the free online course Building a Future with Robots.
