In this project we examined best-practice methods for filtering MoCap data in real-time applications, in particular the potential of low-group-delay filters and their appropriate design methods.
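To make the low-group-delay idea concrete, here is a minimal sketch of a one-pole (exponential smoothing) low-pass filter, a common choice when latency matters. This is an illustrative example only, not one of the filter designs evaluated in the project; the cutoff and sample rate values are assumptions.

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate_hz):
    """Smooth a stream of position samples with a one-pole IIR filter.

    A one-pole filter introduces far less delay at low frequencies than
    a symmetric FIR filter of comparable smoothing strength, which is
    why such recursive filters are attractive for real-time MoCap.
    """
    # Coefficient from the standard RC low-pass discretisation.
    dt = 1.0 / sample_rate_hz
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out = []
    y = samples[0]
    for x in samples:
        y += alpha * (x - y)  # y[n] = y[n-1] + alpha * (x[n] - y[n-1])
        out.append(y)
    return out

# Illustrative input: a noisy-free step in a 100 Hz marker stream,
# filtered with a 10 Hz cutoff. The output follows the step within a
# few samples rather than lagging by half a filter length.
step = [0.0] * 10 + [1.0] * 40
smoothed = one_pole_lowpass(step, cutoff_hz=10.0, sample_rate_hz=100.0)
```

The trade-off is that steeper (higher-order) filters smooth more but delay more; real-time design is about balancing the two.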
Subprojects of the Sensing Music-related Actions project.
A project in Sound programming 2, University of Oslo, Spring 2010.
Sound Tracing, as we define it, entails rendering the perceptual qualities of short sound objects through bodily motion. The underlying assumption and motivation of sound tracing experiments is that they may reveal salient sound-motion links in perception.
A preliminary study on sound tracing was carried out in 2006, where the responses of participants were collected on a digital tablet.
To follow up on the preliminary study, two experiments were carried out in 2009 and 2010, in which response data were gathered using optical infrared marker-based motion capture.
The SoundSaber is a musical instrument based on optical marker-based motion capture technology. The instrument demonstrates how even a simple synthesiser can be intriguing to interact with when the mappings between motion features and sound features are well designed.
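A motion-to-sound mapping of the kind described above can be sketched as a small function from motion features to synthesis parameters. The feature choices (marker speed, marker height) and the scaling constants here are hypothetical illustrations, not the actual SoundSaber mapping.

```python
def map_motion_to_sound(speed_mps, height_m):
    """Map two motion features to two synthesis parameters.

    Illustrative mapping: loudness follows how fast the marker moves,
    pitch follows how high it is held. Constants are assumptions.
    """
    # Amplitude: proportional to speed, clipped to [0, 1];
    # 3 m/s or faster gives full amplitude.
    amplitude = min(max(speed_mps / 3.0, 0.0), 1.0)
    # Pitch: exponential in height so equal height steps give equal
    # musical intervals; 0 m -> 110 Hz, 2 m -> 880 Hz (three octaves).
    height = min(max(height_m, 0.0), 2.0)
    frequency_hz = 110.0 * (8.0 ** (height / 2.0))
    return amplitude, frequency_hz

# Example: a hand moving at 1.5 m/s at chest height (1 m)
# gives half amplitude and a pitch between the extremes.
amp, freq = map_motion_to_sound(1.5, 1.0)
```

Perceptually motivated scalings like the exponential pitch curve are one way such mappings can be made to feel natural rather than mechanical.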
In the Dance Jockey project, we have used a full body inertial motion capture system, the Xsens MVN suit, for musical interaction. The name Dance Jockey is a word play on the well-known term Disc Jockey, or DJ. This name is meant to reflect that instead of using discs to perform music, we use dance or full body motion as the basis for the performance.