Sensing Music-related Actions (completed)
Sensing Music-related Actions (SMA) is an interdisciplinary research project on capturing and understanding body movement in a musical context. It combines methods from musicology and informatics, and the researchers work with advanced motion capture technologies and machine learning techniques.
About the project
This interdisciplinary project, combining scientific, technological and artistic methods, addresses three topics crucial for the future development of multimodal devices:
- the importance of action-sound couplings for both performance and perception of music
- the control potential of human bodily movement
- a move towards an active music experience
Research questions and methods
The basic research question that underlies this project is:
- How can action-sound couplings be exploited in human-computer interaction?
A number of sub-questions are linked to this:
- Which sensor technologies can be used to capture complex body movement?
- Which machine-learning techniques can be used to extract semantic actions from a continuous stream of movement data?
- How are actions and sounds coupled in everyday life?
- How can such theories and technologies be used to control sound and music?
- How can we create musical structures that can be controlled by the user?
Aims and objectives
- Explore action-sound couplings in human-computer interaction.
- Develop sensor technologies for capturing complex body movement.
- Develop machine-learning techniques and segmentation methods for extracting semantic actions from a continuous stream of sensor data.
- Develop theories of action-sound couplings in everyday life.
- Develop prototypes of enactive media devices that allow for continuously controlling music based on the actions of the user.
- Develop hypermusic structures that can be used in the enactive media devices.
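To make the segmentation aim above concrete, here is a minimal sketch (not the project's actual method) of extracting candidate action segments from a continuous stream of position data using a simple velocity threshold. The function name, parameters, and thresholds are illustrative assumptions, not part of the project.

```python
import numpy as np

def segment_actions(positions, fps=100, vel_threshold=0.5, min_frames=5):
    """Split a continuous stream of position data (frames x dims) into
    candidate action segments wherever velocity exceeds a threshold.
    Returns a list of (start_frame, end_frame) index pairs.
    NOTE: an illustrative sketch, not the SMA project's segmentation method."""
    # Frame-to-frame speed, scaled to units per second.
    velocity = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps
    moving = velocity > vel_threshold

    segments, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                      # movement begins
        elif not m and start is not None:
            if i - start >= min_frames:    # discard very short blips
                segments.append((start, i))
            start = None
    if start is not None and len(moving) - start >= min_frames:
        segments.append((start, len(moving)))
    return segments

# Synthetic stream: rest (40 frames) - movement (30 frames) - rest (30 frames)
pos = np.concatenate([np.zeros(40), np.linspace(0, 1, 30), np.ones(30)])
segments = segment_actions(pos[:, None])
print(segments)  # one segment covering the movement phase
```

A real system would replace the fixed threshold with learned models, but the pipeline shape (continuous stream in, discrete labelled actions out) is the same idea the aim describes.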
Cooperation and financing
Sensing Music-related Actions is a joint research project of the departments of Musicology and Informatics, and has received external funding through the VERDIKT programme of the Research Council of Norway. The project ran from July 2008 until December 2012.
Publications
Jensenius, Alexander Refsum (2013). Non-Realtime Sonification of Motiongrams. In Bresin, Roberto (Ed.), Proceedings of the Sound and Music Computing Conference 2013, SMC 2013, Stockholm, Sweden. KTH. ISBN 978-91-7501-831-7. p. 500–505.
Jensenius, Alexander Refsum (2013). Kinectofon: Performing with Shapes in Planes. In Yeo, Woon Seung; Lee, Kyogu; Sigman, Alexander; Ji, Haru & Wakefield, Graham (Ed.), Proceedings of the International Conference on New Interfaces For Musical Expression. Korea Advanced Institute of Science and Technology. ISSN 2220-4792. p. 196–197.
Tørresen, Jim; Hafting, Yngve & Nymoen, Kristian (2013). A new wi-fi based platform for wireless sensor data collection. In Yeo, Woon Seung; Lee, Kyogu; Sigman, Alexander; Ji, Haru & Wakefield, Graham (Ed.), Proceedings of the International Conference on New Interfaces For Musical Expression. Korea Advanced Institute of Science and Technology. ISSN 2220-4792. p. 337–340.
Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge; Jensenius, Alexander Refsum & Høvin, Mats Erling (2013). Methods and Technologies for Analysing Links Between Musical Sound and Body Motion. Akademisk Forlag. ISSN 1501-7710.