Mobile motion capture

Can we take motion capture outside the lab? Read about the use of accelerometers, gyroscopes, magnetometers and Xsens suits in fieldwork.
Image: A large bundle of different instrument cables.

Sometimes it is relevant to capture the movement of performers or perceivers outside a laboratory setting. This can be for research purposes, such as studying how performers or perceivers move during a real concert, or it can be an interactive part of the performance, where the movement of a performer or perceiver is tracked to influence visual displays or the musical sound. This article presents some of the technologies you will see in the next video step.

The most precise way of recording motion is in a dedicated motion capture (mocap) laboratory with optical infrared cameras and reflective markers. However, there are several reasons why we might want to use other types of motion capture systems to study music-related motion:

  • Cutting-edge camera-based mocap technologies are expensive.
  • Setting up a camera-based system in a “real” setting (such as a concert hall) is visually distracting and might disturb the performer.
  • Lighting conditions are often less than ideal when measuring people’s body motion in a “real” situation (as opposed to in a mocap lab).

Inertial measurement units (IMUs)

Inertial sensors operate on the physical principle of inertia:

  • Accelerometers measure linear acceleration.
  • Gyroscopes measure angular velocity, that is, how fast the sensor rotates.
  • Magnetometers measure orientation relative to the Earth’s magnetic field, like a compass.

Magnetometers are, strictly speaking, based on magnetic rather than inertial sensing, but they are often included in what are called inertial measurement units (IMUs). An IMU is found inside most modern smartphones and tablets, and makes it possible to estimate the orientation and acceleration of the device.
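To give a concrete idea of how such sensor data can be combined, here is a minimal Python sketch of a complementary filter, one common way to fuse gyroscope and accelerometer readings into a tilt estimate. The sampling rate, filter weight and toy data are assumptions made for illustration and are not tied to any particular device.

    import math

    def estimate_tilt(samples, dt=0.01, alpha=0.98):
        """Estimate the tilt (pitch) angle of a sensor by fusing gyroscope
        and accelerometer readings with a simple complementary filter.

        samples: list of (gyro_rate_deg_per_s, accel_x_g, accel_z_g) tuples
        dt:      time between samples in seconds (100 Hz assumed here)
        alpha:   weighting between the gyroscope and accelerometer estimates
        """
        angle = 0.0
        history = []
        for gyro_rate, acc_x, acc_z in samples:
            # Integrate the gyroscope rate for a smooth short-term estimate
            gyro_angle = angle + gyro_rate * dt
            # Use gravity, as measured by the accelerometer, as a drift-free
            # (but noisy) long-term reference
            accel_angle = math.degrees(math.atan2(acc_x, acc_z))
            # Blend the two estimates
            angle = alpha * gyro_angle + (1 - alpha) * accel_angle
            history.append(angle)
        return history

    # Toy data: the sensor rotates at 10 degrees per second while the
    # gravity readings slowly confirm the changing tilt
    fake_samples = [(10.0,
                     math.sin(math.radians(i * 0.1)),
                     math.cos(math.radians(i * 0.1)))
                    for i in range(100)]
    print(estimate_tilt(fake_samples)[-1])

The gyroscope integrates smoothly but drifts over time, while the accelerometer is noisy but anchored to gravity; blending the two is what lets an IMU report a usable orientation.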

The Xsens Suit

At the University of Oslo we have an Xsens suit, which consists of 17 IMUs combined with a kinematic model of the human body. The kinematic model constrains the possible position of each IMU, which makes it possible to calculate the position of each sensor from the acceleration and rotation data that the sensors provide.
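As an illustration of the idea behind such a kinematic model (not Xsens’s actual algorithm), the sketch below computes joint positions for a simplified two-segment arm in a plane, given segment orientations of the kind an IMU can estimate and fixed segment lengths. The lengths and the 2D simplification are assumptions made for the example.

    import numpy as np

    # Simplified two-segment "arm": upper arm and forearm lengths in metres.
    # These numbers are illustrative only.
    SEGMENT_LENGTHS = {"upper_arm": 0.30, "forearm": 0.25}

    def joint_positions(shoulder_angle_deg, elbow_angle_deg):
        """Compute elbow and wrist positions in the plane, given segment
        orientations (e.g. estimated from IMU data) and the fixed segment
        lengths of a kinematic model. The shoulder sits at the origin."""
        shoulder = np.array([0.0, 0.0])

        # The orientation of the upper arm determines the elbow position
        a1 = np.radians(shoulder_angle_deg)
        elbow = shoulder + SEGMENT_LENGTHS["upper_arm"] * np.array(
            [np.cos(a1), np.sin(a1)])

        # The orientation of the forearm determines the wrist position
        a2 = np.radians(shoulder_angle_deg + elbow_angle_deg)
        wrist = elbow + SEGMENT_LENGTHS["forearm"] * np.array(
            [np.cos(a2), np.sin(a2)])

        return elbow, wrist

    # Example: arm held out horizontally, elbow bent 45 degrees upwards
    elbow, wrist = joint_positions(shoulder_angle_deg=0.0, elbow_angle_deg=45.0)
    print("elbow:", elbow, "wrist:", wrist)

Because each segment can only move in ways the body model allows, orientation data from the IMUs is enough to reconstruct where every sensor, and thus every body part, must be.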

The Xsens suit makes it possible to record motion in any location. It can also be used in real time, so the motion data can become an interactive element of a music performance. Here is an example of the Xsens suit used in performance. In the video, the performer is controlling a number of synthesizers with his movements. The data is fed to a program called Ableton Live, where certain parts of the music are pre-composed and other parts are controlled by the performer (e.g. triggering new sections and selecting between different chords). We will look more closely at such interactive music technologies in the methods track in week 6.
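As a rough sketch of how motion data can be turned into something a program like Ableton Live can respond to, the example below maps acceleration values onto MIDI control-change messages using the mido library. The output port, the CC number and the scaling are assumptions made for illustration; the actual mapping used in the performance in the video is not documented here.

    import mido

    def acceleration_to_cc(acc, max_acc=20.0):
        """Map an acceleration magnitude (m/s^2) onto the 0-127 MIDI CC range."""
        scaled = min(abs(acc) / max_acc, 1.0)
        return int(scaled * 127)

    # Open the default MIDI output port. In practice this would be a virtual
    # port that Ableton Live listens to (port names differ per system).
    with mido.open_output() as port:
        # Pretend these values arrive from an IMU stream in real time
        for acc in [0.5, 3.2, 9.8, 15.0, 4.1]:
            msg = mido.Message("control_change", channel=0, control=1,
                               value=acceleration_to_cc(acc))
            port.send(msg)

In Ableton Live, a control-change message like this can be mapped to almost any parameter, which is how body motion can end up steering filters, volumes or clip launching.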

© University of Oslo. This article is from the free online course Music Moves: Why Does Music Make You Move?
