Alexander Refsum Jensenius

Professor of music technology, University of Oslo. Director of the fourMs Lab and Director of RITMO Centre for Interdisciplinary Studies in Rhythm, Time, and Motion

Location: University of Oslo, Norway

Activity

  • It is essential to check the license on the data/media you want to use. Some licenses only allow sharing, others allow making derivatives, and others don't allow for anything. The challenge is that much data/media does not have any licenses attached, and then it is necessary to ask the copyright holder about how you can use the material. As a courtesy to...

  • Yes, facial information may be essential for research (like in emotions research) but may be difficult to handle from a privacy point of view. The key is to get consent for whatever you will use the recording for.

  • Sorry for the late reply. Please be aware that there are both legal (like GDPR) and ethical concerns. Different institutions handle these differently; sometimes privacy and ethics are evaluated together, but otherwise not. Getting a guest researcher status may help to get support!

  • Trying to cover (basic) coding in this course would be too much. There are many other courses you can take on that topic.

  • Yes, EMG is not a straightforward technique. It measures the electrical activity of muscles, but it highly depends on the placement of the electrodes. That is why it is also challenging to interpret what you get. Motion capture is easier to work with, since you can more directly see what you measure.

  • Yes, I have also been exploring the use of the Physics Toolbox Suite. On my (Android) phone, I can export the recorded data and send it to my e-mail or a storage solution.

  • Yes, check out the papers, then you will get more information about the mappings and development process.

  • Yes! :)

  • Thanks for the feedback, we will try to improve the figure for the next run.

  • I am primarily working on Linux these days, focusing on developing the [Musical Gestures Toolbox for Python](https://github.com/fourMs/MGT-python). This toolbox works well on Mac, Windows and Linux (and so does the Matlab version, although we no longer add new features there). We work with PD for many things (particularly interactive music systems) but not for...

  • Happy it was helpful for you. Good luck with using your new knowledge in real-life projects!

  • Yes, this is very complex motion capture and probably requires quite a lot of post-processing!

  • Yes, it is quite a lot of things to think about. However, in my experience, most people get into it relatively quickly once they start using the equipment.

  • The sensitivity of the cameras and the brightness of the infrared light can typically be adjusted in software, either for all or individual cameras. Some cameras also have interchangeable lenses, which may further influence the light level.

  • Yes, this looks like a setup that can result in good recordings.

  • You are correct in that using a black cover over reflective elements can work well. So, yes, it is not necessary to have a completely empty space. But often, there are more reflective things in a space than one thinks, so we often try to remove as much as possible before calibrating. For larger setups, including pianos, etc., it is obviously not so easy to...

  • Ah, I think there is a terminology problem here. The body in "rigid body" refers to an object with a fixed shape, not a human body (part). In our lab, we work with small plastic plates with pins sticking out on the sides that markers can be fastened to. We have many of these, with slightly different placement of the pins. Then it is possible to identify each plate. It may...

  • Yes, indeed. Testing various solutions and choosing the most efficient one is a good way of doing it.

  • I see now that Qualisys has an underwater mocap solution: https://www.qualisys.com/cameras/underwater/

  • Great, thanks for joining!

  • Thanks for mentioning this. Even though it may take some time at the beginning of the project, creating a DMP early on may save you a lot of time later!

  • Thanks for sharing. Yes, this is a great example of how a very limited type of motion capture (based on the [Genki ring](https://genkiinstruments.com/)) can lead to exciting artistic results!

  • Cool, thanks for sharing!

  • Yes and no. It is tricky at first, but once you get more experienced it is quite fast. The best is to capture good data first, then you may not need to do much post-processing at all! :)

  • Yes, then you have to determine which is which in post-processing.

  • There is no standard approach to marker placement; it depends on what you want to capture. [Here](http://mocap.cs.cmu.edu/markerPlacementGuide.pdf), you can see a quite standard setup and labelling.

  • Yes, they can be. Some more affordable solutions are available if you can rely on web cameras and open source software, but most commercially available systems are expensive.

  • Yes, indeed!

  • I don't have any experience with mocap in water myself, but I would imagine that it is quite complicated to calibrate! Would be interesting to hear if someone else has any experience!

  • Thanks for the nice reference to Maxime Sheets-Johnstone.

  • Sorry about the delay. The wrap-up video is online now! :)

  • Yes, that is often the case!

  • Yes, we miss the Myo sensors! Fortunately, one of our postdoctoral fellows has decided to start production of a similar (and better!) device: https://sifilabs.com/

  • Yes, correct. Most software will give you a score of the fill level of each marker trajectory. So it is best to start with the ones with high fill levels and work your way down the list.
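    The fill level mentioned above can be thought of as the fraction of frames in which a marker was successfully tracked. A minimal sketch of the idea (the trajectory format here is a made-up stand-in, not the export format of any particular mocap software):

    ```python
    def fill_level(trajectory):
        """Fraction of frames with valid (non-None) marker positions."""
        if not trajectory:
            return 0.0
        valid = sum(1 for frame in trajectory if frame is not None)
        return valid / len(trajectory)

    def sort_by_fill(trajectories):
        """Marker names sorted by fill level, highest first, so you can
        start post-processing with the most complete trajectories."""
        return sorted(trajectories,
                      key=lambda name: fill_level(trajectories[name]),
                      reverse=True)

    # Toy data: None marks a frame where the marker was lost (occluded).
    markers = {
        "head":  [(0.0, 1.7, 0.0)] * 90 + [None] * 10,   # 90% fill
        "wrist": [(0.3, 1.0, 0.1)] * 50 + [None] * 50,   # 50% fill
    }
    print(sort_by_fill(markers))  # head before wrist
    ```

    Working down such a list, high fill first, matches the workflow described above.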

  • These are all good questions!

    It is possible to rotate the 3D image in motion capture software. In fact, it is often necessary to rotate back and forth to understand the direction of the body at a particular time. We always label with respect to the body. I typically try to "project" myself into the person I am marking up to get the directions right....

  • Yes, dance is the trickiest to record. I think you have many good thoughts about how to create a good recording!

  • Thanks for notifying us! I have updated the article with the correct figures now.

  • If you are mainly interested in head-tracking there are some new and affordable solutions, e.g. this head-band with MIDI support: https://supperware.co.uk/

  • Great!

  • Great to have you here. Hopefully, you will learn enough to start exploring various systems yourself!

  • I don't know about any of these systems. For casual usage, I think something like OpenPose may work well: https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/ - it requires some coding skills, though.

  • Yes, swimming is tricky from a motion capture perspective, but there are people doing underwater motion capture!

  • I didn't know about PhysCap, thanks for sharing. There is also OpenPose that can track 3D motion from only a video camera: https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/

  • Swimming is an interesting (but tricky) case. If you have windows in the pool, you can probably do some video-based analysis. Some of the more expensive mocap systems can be used underwater.

  • With infrared systems, there are two solutions: (1) making "rigid bodies" with a constellation of markers (usually 3 or 4), (2) using active markers that emit light themselves (but these require a power pack). Both of these would make the setup more bulky. We will get to inertial systems later, and there you don't have this problem.

  • Knitting is a great example and is very complex both spatially and temporally. It would be great to do some analysis of such an activity!

    Last year, I did a project called [365 Sound Actions](https://www.arj.no/2022/12/31/365-sound-actions/) and one of the recordings was of [knitting](https://www.youtube.com/watch?v=NvAVKQAC4Io). Will try to make some...

  • Ah, yes, good point. I am no Laban expert, but perhaps there are some others here that could share links to examples of notation and related motion?

  • Yes, we will explain more in the coming weeks! Looking at velocity is a good idea. I often calculate the "quantity of motion" as a measure of how much you move. It is a simple measure but may give you a sense of what is going on.

    As for software needs, it varies. There are tools that are quite simple to start with, both commercial and open source. One of...
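    The "quantity of motion" idea above can be sketched as the summed frame-to-frame displacement of all tracked points. This is a simplified illustration of the concept, not the exact formula used in any particular toolbox (video-based variants instead sum changed pixels between frames):

    ```python
    import math

    def quantity_of_motion(frames):
        """Sum of Euclidean displacements of all markers between
        consecutive frames. Each frame is a list of (x, y, z) positions."""
        qom = 0.0
        for prev, curr in zip(frames, frames[1:]):
            for a, b in zip(prev, curr):
                qom += math.dist(a, b)
        return qom

    # Two markers: one moving 0.1 m per frame along x, one static.
    frames = [
        [(0.0, 0.0, 0.0), (0.0, 1.0, 0.0)],
        [(0.1, 0.0, 0.0), (0.0, 1.0, 0.0)],
        [(0.2, 0.0, 0.0), (0.0, 1.0, 0.0)],
    ]
    print(quantity_of_motion(frames))  # ≈ 0.2
    ```

    Larger values simply mean more overall movement, which is why it works as a quick first-pass descriptor.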

  • Realtime motion tracking is challenging but lots of fun! We will talk more about that later on in the course.

  • We have some OptiTrack systems, too. They are more portable than a complete Qualisys system, so we typically use them for out-of-lab settings.

  • @VickyFisher As we will discuss later in this course, there are some challenges with optical systems in dance (particularly with occlusion). Then sensor-based suits may work better, although they are more obtrusive (need to wear sensors and suits).

  • There are good resources in Plymouth, so it would be great if you could access those.

    I didn't know about the Empathic AI toolkit; thanks for sharing!

  • Yes, we are working with a Qualisys system in Oslo and it works very well in a lab context.

  • Yes, systematic qualitative approaches, like Laban Movement Analysis, are great for capturing elements that may not be easily capturable with quantitative systems. And vice versa. The best is to combine different methods, I think.

  • Yes, I think both sensor-based and camera-based systems "count" as motion capture!

  • Welcome to the course! This is the second run, and we are eager to have a new group of learners join us!

  • Thanks for the message, we will try to find another clip that works.

  • Yes, taking a video is a good idea!

  • Good question, hopefully you will have an answer in a couple of weeks. :)

  • Yes, indeed. There have been some brain studies of musicians. Take a look at this article: https://www.frontiersin.org/articles/10.3389/fnhum.2020.576888/full

  • Yes, please send us feedback on what worked (or didn't work) for you. We love to get feedback to improve future runs of the course.

  • We typically use standstill as a baseline. It is possible to consider other baselines, e.g. walking or some other repetitive activity. In our studies, we also screen for medical conditions, etc. In our lab, we mainly work with healthy adults. We are currently working on various sonification approaches to help people listen to their own body motion.

  • Yes, I guess it can, although you would need to know what you are looking for. So a visualization is only as powerful as the eyes viewing it. :)

  • I don't know of people using it on historical material, but there is nothing that prevents it. I would be curious to see the results!

  • As long as you have a relatively high sampling rate (100 Hz or more), there shouldn't be any problems with the calibration. We typically suggest using the same sampling rate for calibration as for the capture.

  • Sorry for the late reply, but we don't moderate the discussion after the initial course run is done. To answer your questions: we don't offer any automatic ways of overlaying the different outputs. But they are standard images/videos/data files that you can use in different ways with your other software.

  • Great question! I have only seen some news stories about ABBA's performances. Has anyone seen them "live"?

  • I guess the question here is how you define a musical instrument! My (long) answer can be found in my upcoming book: https://mitpress.mit.edu/books/sound-actions

  • Yes, the Reactable is great. I didn't know That One Guy, thanks for sharing!

  • Ah, yes, these are good questions! In fact, so good that I have written a book about them (and others). It will be out soon: https://mitpress.mit.edu/books/sound-actions

  • Yes, indeed. You can check out some of our latest development when it comes to "air guitar" performance here: https://www.youtube.com/watch?v=-_wgBZY2iF8

  • Ah, yes, I think there are many different types of metaphors that can be used. You may find it interesting to have a look at the introduction of Tejaswinee Kelkar's dissertation: https://www.duo.uio.no/handle/10852/71043 in which she writes about different approaches to our understanding of pitch and melody.

  • Great to have you with us!

  • Yes, animal sound (and music) is an interesting topic. For some more on this topic, check out this book by Henkjan Honing: https://mitpress.mit.edu/books/evolving-animal-orchestra

  • I know of several studies of various types of musical features in Indian classical music, but none focused on groove. Would be interesting to hear if someone has any examples!

  • Nice example, thanks for sharing.

  • Yes, those terms are somewhat tricky to differentiate. Try re-reading the relevant sections! :)

  • Yes, I agree!

  • Great that you enjoyed the course. Good luck with your data collection!

  • Yes, good point. For videos, we have often generated motion videos, such as this one: https://www.youtube.com/watch?v=aQ7tjztMI2M. This makes it possible to get a sense of the motion even though the dancer's identity is covered. New tools such as deface (https://github.com/ORB-HD/deface) allow for blurring faces automagically.

  • There are certainly many copyright issues to deal with when working with musical material: composers, lyricists, performers, producers, etc. This makes it very difficult to meet open research requirements. I think it will be important to make clearer license models (also beyond creative commons licenses) and systems for handling copyright issues for researchers.

  • Yes, I agree. We often think that musicians have particular *sonic* styles, but these are also linked to *movement* styles. Would be interesting to do some more systematic analysis of this!

  • Yes, very good point. It is difficult to control for this and find reliable ways of reporting/measuring the use of alcohol/drugs. I am not aware of any mocap studies where this has been controlled for systematically, but perhaps someone else has some pointers?

  • Cool study!

  • Welcome! Great question about what influences our actions, but not easy to answer.

  • Ah, only fixed the link in the text. Have fixed the other link too now. Thanks for letting us know.

  • Thanks, have fixed the URL now.

  • The MYO is/was a brand name, but obviously inspired by electromyography. And, yes, muscle sensing is used in many different domains these days!

  • That is a good point. We are doing so many different types of studies, so it is difficult to generalize here. But for hypothesis-driven studies, we have started doing pre-registrations: https://osf.io/prereg/

  • Yes, I think there is a lot of overlap. All the new DMP templates differ slightly, but they try to formalize more or less the same legal requirements.

  • Have fixed the link now, thanks for notifying us!

  • Yes. When doing the post-processing, it often helps to watch the trajectories as they unfold on screen and use this to identify the ones to connect. But sometimes it is very hard! Then it may help to look at asymmetric markers.

  • Good question! I cannot think of any mocap studies of polyrhythms or multiple rhythms right away, but perhaps someone else knows any?

  • Human micromotion when standing still is around 5-10 mm/s, while the mocap systems we work with can reliably capture around 1 mm/s (depending on setup and calibration). We have done some measurements of different systems here: http://urn.nb.no/URN:NBN:no-31295
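  • The mm/s figures above are average speeds. A back-of-the-envelope sketch of how such a speed can be computed from sampled marker positions (simplified; a real analysis would also filter out measurement noise first):

    ```python
    import math

    def average_speed(positions, rate_hz):
        """Mean speed (position units per second) from 3D positions
        sampled at rate_hz. positions: list of (x, y, z) tuples."""
        if len(positions) < 2:
            return 0.0
        path = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
        duration = (len(positions) - 1) / rate_hz
        return path / duration

    # A head marker drifting 0.05 mm per sample at 100 Hz gives 5 mm/s,
    # in the range of human standstill micromotion mentioned above.
    samples = [(0.05 * i, 0.0, 0.0) for i in range(101)]
    print(average_speed(samples, 100))  # ≈ 5.0 (mm/s)
    ```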

  • We have tried with several different types of music over the years, but have tried to avoid singing. The main reason for that is to focus on musical features and not get into lyrics (which is super-interesting but also much more complicated).

  • You mean for capturing skeleton-like images? If you can program, something like OpenPose may be a starting point: https://github.com/CMU-Perceptual-Computing-Lab/openpose

    If you are more interested in general motion visualization, take a look at some of our software: https://www.uio.no/ritmo/english/research/labs/fourms/software/musicalgesturestoolbox/

  • From our perspective of "embodied musicology", bodily and pleasurable experiences are at the core of musical activities. Not all scholars agree that this is important, though...

  • Good question! Some musicians have a very clear understanding of the music theory behind what they are playing. Others don't but they can still play it.

  • Yes, it is tricky at first!

  • Thanks for notifying about the broken link. Have updated the Franz Ferdinand video now.