Sensing Music-related Actions (completed)

Sensing Music-related Actions (SMA) is an interdisciplinary research project on capturing and understanding body movement in a musical context. It combines methods from musicology and informatics, with the researchers working with advanced motion capture technologies and machine learning techniques.

About the project

This interdisciplinary project, combining scientific, technological and artistic methods, identifies three topics as crucial to the future development of multimodal devices:

  • the importance of action-sound couplings for both performance and perception of music
  • the control potential of human bodily movement
  • a move towards an active music experience

Research questions and methods

The basic research question that underlies this project is:

  • How can action-sound couplings be exploited in human-computer interaction?

A number of sub-questions are linked to this:

  • Which sensor technologies can be used to capture complex body movement?
  • Which machine-learning techniques can be used to extract semantic actions from a continuous stream of movement data?
  • How are actions and sounds coupled in everyday life?
  • How can such theories and technologies be used to control sound and music?
  • How can we create musical structures that can be controlled by the user?

Aims and objectives

Principal objective:

  • Exploring action-sound couplings in human-computer interaction.

Sub-goals:

  • Develop sensor technologies for capturing complex body movement.
  • Develop machine-learning techniques and segmentation methods for extracting semantic actions from a continuous stream of sensor data.
  • Develop theories of action-sound couplings in everyday life.
  • Develop prototypes of enactive media devices that allow for continuously controlling music based on the actions of the user.
  • Develop hypermusic structures that can be used in the enactive media devices.
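One of the sub-goals above is extracting semantic actions from a continuous stream of sensor data. The project's actual segmentation methods are not described on this page, but as an illustration, a minimal baseline is to threshold the instantaneous speed of a tracked point and treat contiguous above-threshold runs as candidate action segments. The function name and parameter values below are hypothetical:

```python
import numpy as np

def segment_actions(positions, fps=100, vel_threshold=0.05, min_len=10):
    """Split a continuous stream of position samples into candidate
    action segments via speed thresholding (a common baseline, not
    necessarily the method used in the SMA project).

    positions: (n_samples, n_dims) array of tracked positions.
    Returns a list of (start, end) sample indices.
    """
    # Per-sample speed: magnitude of the frame-to-frame displacement,
    # scaled by the sampling rate to get units per second.
    speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps
    moving = speed > vel_threshold

    segments = []
    start = None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                      # movement onset
        elif not m and start is not None:
            if i - start >= min_len:       # drop very short blips
                segments.append((start, i))
            start = None
    if start is not None and len(moving) - start >= min_len:
        segments.append((start, len(moving)))
    return segments

# Example: 0.5 s of stillness, 0.5 s of steady motion, 0.5 s of stillness.
pos = np.concatenate([
    np.zeros((50, 3)),
    np.cumsum(np.full((50, 3), 0.01), axis=0),
    np.full((50, 3), 0.5),
])
print(segment_actions(pos))  # one segment covering the moving portion
```

In practice such a threshold detector would only be a pre-processing step; mapping the resulting segments to semantic action categories is where the machine-learning techniques mentioned above would come in.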

Cooperation and financing

Sensing Music-related Actions is a joint research project of the Departments of Musicology and Informatics, and has received external funding through the VERDIKT programme of the Research Council of Norway. The project ran from July 2008 until December 2012.

Publications

  • Alexander Refsum Jensenius (2013). Kinectofon: Performing with Shapes in Planes. In Graham Wakefield, Woon Seung Yeo, Haru Ji, Alexander Sigman & Kyogu Lee (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Korea Advanced Institute of Science and Technology. pp. 196–197.
  • Jim Tørresen, Yngve Hafting & Kristian Nymoen (2013). A New Wi-Fi Based Platform for Wireless Sensor Data Collection. In Kyogu Lee, Alexander Sigman, Haru Ji, Graham Wakefield & Woon Seung Yeo (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Korea Advanced Institute of Science and Technology. pp. 337–340.
  • Alexander Refsum Jensenius (2013). Non-Realtime Sonification of Motiongrams. In Roberto Bresin (ed.), Proceedings of the Sound and Music Computing Conference 2013, SMC 2013, Stockholm, Sweden. KTH. ISBN 978-91-7501-831-7. pp. 500–505.
  • Ståle Andreas van Dorp Skogstad, Kristian Nymoen, Mats Erling Høvin, Sverre Holm & Alexander Refsum Jensenius (2013). Filtering Motion Capture Data for Real-Time Applications. In Graham Wakefield, Woon Seung Yeo, Kyogu Lee, Alexander Sigman & Haru Ji (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. Korea Advanced Institute of Science and Technology. pp. 142–147.
  • Alexander Refsum Jensenius (2013). Non-Realtime Sonification of Motiongrams. In Roberto Bresin (ed.), Proceedings of the Sound and Music Computing Conference 2013. Logos Verlag Berlin. ISBN 978-3-8325-3472-1. pp. 500–505.
  • Rolf Inge Godøy (2013). Quantal Elements in Musical Experience. In Rolf Bader (ed.), Sound – Perception – Performance. Springer. ISBN 9783319001067. Chapter 4, pp. 113–128.
  • Kristian Nymoen, Arve Voldsund, Ståle Andreas van Dorp Skogstad, Alexander Refsum Jensenius & Jim Tørresen (2012). Comparing Motion Data from an iPod Touch to a High-End Optical Infrared Marker-Based Motion Capture System. In Brent Gillespie, Georg Essl, Sile O’Modhrain & Michael Gurevich (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. University of Michigan Press. ISBN 978-0-9855720-1-3. pp. 88–91.
  • Alexander Refsum Jensenius & Arve Voldsund (2012). The Music Ball Project: Concept, Design, Development, Performance. In Sile O’Modhrain, Georg Essl, Michael Gurevich & Brent Gillespie (eds.), Proceedings of the International Conference on New Interfaces for Musical Expression. University of Michigan Press. ISBN 978-0-9855720-1-3. pp. 300–303.

  • Alexander Refsum Jensenius, Rolf Inge Godøy, Anders Tveit & Dan Overholt (eds.) (2011). Proceedings of the International Conference on New Interfaces for Musical Expression. Universitetet i Oslo. ISBN 978-82-991841-6-8. 586 pp.
  • Alexander Refsum Jensenius, Ståle Andreas van Dorp Skogstad, Anette Forsbakk & Alison Bullock Aarsten (eds.) (2011). Program Book of the International Conference on New Interfaces for Musical Expression. Universitetet i Oslo and Norges musikkhøgskole. 92 pp.

Published Aug. 21, 2012 11:42 AM - Last modified Apr. 6, 2018 7:57 PM