Alexander Refsum Jensenius

Professor of music technology at the University of Oslo. Director of the fourMs Lab and of RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion.

Location: University of Oslo, Norway

Activity

  • It is essential to check the license on the data/media you want to use. Some licenses only allow sharing, others allow making derivatives, and others don't allow any reuse at all. The challenge is that much data/media does not have any license attached, and then it is necessary to ask the copyright holder how you can use the material. As a courtesy to...

  • Yes, facial information may be essential for research (like in emotion research) but may be difficult to handle from a privacy point of view. The key is to get consent for whatever you will use the recording for.

  • Sorry for the late reply. Please be aware that there are both legal (like GDPR) and ethical concerns. Different institutions handle these differently; sometimes privacy and ethics are evaluated together, sometimes separately. Getting guest researcher status may help you get support!

  • Trying to cover (basic) coding in this course would be too much. There are many other courses you can take on that topic.

  • Yes, EMG is not a straightforward technique. It measures the electrical activity of muscles, but the signal depends heavily on the placement of the electrodes. That is why it is also challenging to interpret what you get. Motion capture is easier to work with, since you can more directly see what you measure.

  • Yes, I have also been exploring the use of the Physics Toolbox Suite. On my (Android) phone, I can export the recorded data and send it to my e-mail or a storage solution.
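
    For anyone who wants to look at such an export, here is a minimal sketch in Python, assuming a CSV file with a time column and three accelerometer columns (the file and column names are assumptions, not taken from the app):

    ```python
    # Minimal sketch: load and plot smartphone accelerometer data exported as CSV.
    # The file name and column names ("time", "ax", "ay", "az") are assumptions;
    # adjust them to match the actual export from the app.
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("sensor_export.csv")      # hypothetical file name
    t = df["time"].to_numpy()
    acc = df[["ax", "ay", "az"]].to_numpy()
    magnitude = np.linalg.norm(acc, axis=1)    # overall acceleration magnitude

    plt.plot(t, magnitude)
    plt.xlabel("Time (s)")
    plt.ylabel("Acceleration magnitude")
    plt.title("Exported smartphone accelerometer data")
    plt.show()
    ```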

  • Yes, check out the papers, then you will get more information about the mappings and development process.

  • Yes! :)

  • Thanks for the feedback, we will try to improve the figure for the next run.

  • I am primarily working on Linux these days, focusing on developing the [Musical Gestures Toolbox for Python](https://github.com/fourMs/MGT-python). This toolbox works well on Mac, Windows and Linux (and so does the Matlab version, although we no longer add new features there). We work with PD for many things (particularly interactive music systems) but not for...
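
    To give an idea of the kind of video analysis such toolboxes perform, here is a generic frame-differencing sketch written directly with OpenCV. It illustrates the principle of a "motion image" only; it is not the toolbox's own API, and the input file name is an assumption:

    ```python
    # Generic frame-differencing sketch (not the MGT-python API): computes a simple
    # "motion image" by averaging absolute differences between consecutive frames.
    import cv2
    import numpy as np

    cap = cv2.VideoCapture("dance.mp4")   # hypothetical input file
    ok, first = cap.read()
    if not ok:
        raise RuntimeError("Could not read the video file")
    prev = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY).astype(np.float32)

    motion_sum = np.zeros_like(prev)
    n_frames = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        motion_sum += np.abs(gray - prev)   # per-pixel change between frames
        prev = gray
        n_frames += 1
    cap.release()

    # Average motion image: bright areas indicate where movement occurred.
    motion_image = (motion_sum / max(n_frames, 1)).clip(0, 255).astype(np.uint8)
    cv2.imwrite("motion_image.png", motion_image)
    ```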

  • Happy it was helpful for you. Good luck with using your new knowledge in real-life projects!

  • Yes, this is very complex motion capture and probably requires quite a lot of post-processing!

  • Yes, there is quite a lot to think about. However, in my experience, most people get into it relatively quickly once they start using the equipment.

  • The sensitivity of the cameras and the brightness of the infrared light can typically be adjusted in software, either for all or individual cameras. Some cameras also have interchangeable lenses, which may further influence the light level.

  • Yes, this looks like a setup that can result in good recordings.

  • You are correct that using a black cover over reflective elements can work well. So, yes, it is not necessary to have a completely empty space. But often, there are more reflective things in a space than one thinks, so we often try to remove as much as possible before calibrating. For larger setups, including pianos, etc., it is obviously not so easy to...

  • Ah, I think there is a terminology problem here. The body in "rigid body" refers to a rigid, non-deforming object, not a human body (part). In our lab, we work with small plastic plates with pins sticking out on the sides that markers can be fastened to. We have many of these, with slightly different placements of the pins. Then it is possible to identify each plate. It may...

  • Yes, indeed. Testing various solutions and choosing the most efficient one is a good way of doing it.

  • I see now that Qualisys has an underwater mocap solution: https://www.qualisys.com/cameras/underwater/

  • Great, thanks for joining!

  • Thanks for mentioning this. Even though it may take some time at the beginning of the project, creating a DMP early on may save you a lot of time later!

  • Thanks for sharing. Yes, this is a great example of how a very limited type of motion capture (based on the [Genki ring](https://genkiinstruments.com/)) can lead to exciting artistic results!

  • Cool, thanks for sharing!

  • Yes and no. It is tricky at first, but once you get more experienced it is quite fast. It is best to capture good data first; then you may not need to do much post-processing at all! :)

  • Yes, then you have to determine which is which in post-processing.

  • There is no standard approach to marker placement; it depends on what you want to capture. [Here](http://mocap.cs.cmu.edu/markerPlacementGuide.pdf), you can see a fairly standard setup and labelling.

  • Yes, they can be. Some more affordable solutions are available if you can rely on web cameras and open source software, but most commercially available systems are expensive.

  • Yes, indeed!

  • I don't have any experience with mocap in water myself, but I would imagine that it is quite complicated to calibrate! It would be interesting to hear if someone else has any experience!

  • Thanks for the nice reference to Maxine Sheets-Johnstone.

  • Sorry about the delay. The wrap-up video is online now! :)

  • Yes, that is often the case!

  • Yes, we miss the Myo sensors! Fortunately, one of our postdoctoral fellows has decided to start production of a similar (and better!) device: https://sifilabs.com/

  • Yes, correct. Most software will give you a score of the fill level of each marker trajectory. So it is best to start with the ones with high fill levels and work your way down the list.
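
    As an illustration of the idea, here is a minimal sketch that computes a fill level (the share of frames with valid data) per marker and sorts the markers accordingly. The data layout used here is an assumption for illustration, not a specific software's format:

    ```python
    # Minimal sketch: compute the "fill level" of each marker trajectory and list
    # the markers from most to least complete. The layout (marker name -> an
    # (n_frames, 3) array with NaNs marking gaps) is an assumption.
    import numpy as np

    def fill_level(trajectory: np.ndarray) -> float:
        """Fraction of frames in which all three coordinates are present."""
        return float(np.mean(~np.isnan(trajectory).any(axis=1)))

    # Hypothetical example data: one complete and one gappy trajectory.
    rng = np.random.default_rng(0)
    trajectories = {
        "head": rng.random((1000, 3)),
        "left_hand": np.where(rng.random((1000, 3)) < 0.2, np.nan, rng.random((1000, 3))),
    }

    for name, traj in sorted(trajectories.items(), key=lambda kv: fill_level(kv[1]), reverse=True):
        print(f"{name}: {fill_level(traj):.1%} filled")
    ```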

  • These are all good questions!

    It is possible to rotate the 3D image in motion capture software. In fact, it is often necessary to rotate back and forth to understand the direction of the body at a particular time. We always label with respect to the body. I typically try to "project" myself into the person I am marking up to get the directions right....

  • Yes, dance is the trickiest to record. I think you have many good thoughts about how to create a good recording!

  • Thanks for notifying us! I have updated the article with the correct figures now.

  • If you are mainly interested in head-tracking, there are some new and affordable solutions, e.g. this headband with MIDI support: https://supperware.co.uk/
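
    As a rough idea of how to read such data, here is a minimal sketch using the mido library. The mapping of controller numbers to yaw/pitch/roll below is an assumption for illustration; check the device documentation for the actual mapping:

    ```python
    # Minimal sketch: read head-orientation data arriving as MIDI control-change
    # messages, using the mido library. The CC-number-to-axis mapping and the
    # value scaling are assumptions, not the device's documented behaviour.
    import mido

    CC_NAMES = {16: "yaw", 17: "pitch", 18: "roll"}   # hypothetical CC assignments

    print("Available MIDI inputs:", mido.get_input_names())
    with mido.open_input() as port:                    # opens the default input port
        for msg in port:
            if msg.type == "control_change" and msg.control in CC_NAMES:
                # MIDI CC values are 0-127; rescale to roughly -90..+90 degrees.
                degrees = (msg.value / 127.0) * 180.0 - 90.0
                print(f"{CC_NAMES[msg.control]}: {degrees:.1f} deg")
    ```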

  • Great!

  • Great to have you here. Hopefully, you will learn enough to start exploring various systems yourself!

  • I don't know about any of these systems. For casual usage, I think something like OpenPose may work well: https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/ - it requires some coding skills, though.
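
    As a starting point for that coding, here is a minimal sketch that reads the per-frame JSON files OpenPose can write and extracts the trajectory of one keypoint. The output folder name and the keypoint index are assumptions for illustration:

    ```python
    # Minimal sketch: read per-frame OpenPose JSON output (pose keypoints stored as
    # flat [x, y, confidence, ...] lists) and extract one keypoint's 2D trajectory.
    import json
    from pathlib import Path

    import numpy as np

    KEYPOINT_INDEX = 0                    # e.g. the nose in the BODY_25 model (assumption)
    json_dir = Path("openpose_output")    # hypothetical output folder

    trajectory = []
    for json_file in sorted(json_dir.glob("*.json")):
        with open(json_file) as f:
            frame = json.load(f)
        if not frame["people"]:
            trajectory.append((np.nan, np.nan))       # no person detected this frame
            continue
        kp = frame["people"][0]["pose_keypoints_2d"]  # first detected person
        x, y, conf = kp[KEYPOINT_INDEX * 3 : KEYPOINT_INDEX * 3 + 3]
        trajectory.append((x, y) if conf > 0 else (np.nan, np.nan))

    trajectory = np.array(trajectory)
    print("Frames:", len(trajectory), "- first positions:", trajectory[:3])
    ```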

  • Yes, swimming is tricky from a motion capture perspective, but there are people doing underwater motion capture!

  • I didn't know about PhysCap, thanks for sharing. There is also OpenPose, which can track body motion from only a video camera: https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/