Papers at NIME 2013

fourMs researchers are presenting three papers at this year’s NIME conference in Daejeon.

Skogstad, S. A., Nymoen, K., Høvin, M., Holm, S., and Jensenius, A. R. (2013). Filtering motion capture data for real-time applications. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

In this paper we present some custom-designed filters for real-time motion capture applications. Our target application is motion controllers, i.e., systems that interpret hand motion for musical interaction. In earlier research we found effective methods to design nearly optimal filters for real-time applications. However, to design suitable filters for our target application, it is necessary to establish the typical frequency content of the motion capture data we want to filter. This, in turn, allows us to determine a reasonable cutoff frequency for the filters. We have therefore conducted an experiment in which we recorded the hand motion of 20 subjects. The frequency spectra of these data, together with a method similar to the residual analysis method, were then used to determine reasonable cutoff frequencies. Based on this experiment, we propose three cutoff frequencies for different scenarios and filtering needs: 5, 10 and 15 Hz, which correspond to heavy, medium and light filtering, respectively. Finally, we propose a range of real-time filters applicable to motion controllers, in particular low-pass filters and low-pass differentiators of degrees one and two, which in our experience are the most useful filters for our target application.
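To illustrate the kind of filtering the abstract discusses, here is a minimal sketch of a one-pole low-pass filter parameterized by a cutoff frequency. This is a generic textbook filter, not the custom near-optimal filters the paper itself designs; the function name and the 100 Hz sample rate in the usage note are assumptions for illustration only.

```python
import math

def one_pole_lowpass(samples, cutoff_hz, sample_rate_hz):
    """Smooth a 1-D signal with a one-pole IIR low-pass filter.

    Generic illustration of cutoff-based filtering; the paper's own
    filters are custom-designed and differ from this simple form.
    """
    # Convert the cutoff frequency into the smoothing coefficient alpha.
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate_hz
    alpha = dt / (rc + dt)

    out = []
    prev = samples[0]
    for x in samples:
        # Each output sample moves a fraction alpha toward the new input.
        prev = prev + alpha * (x - prev)
        out.append(prev)
    return out
```

With the paper's proposed settings, calling this with `cutoff_hz=5` gives heavier smoothing (more lag, less jitter) than `cutoff_hz=15`, which tracks fast hand motion more closely.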


Jensenius, A. R. (2013). Kinectofon: Performing with shapes in planes. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 196–197, Daejeon, Korea.

The paper presents the Kinectofon, an instrument for creating sounds through free-hand interaction in a 3D space. The instrument is based on the RGB and depth image streams retrieved from a Microsoft Kinect sensor device. These two image streams are used to create different types of motiongrams, which in turn are used as the source material for a sonification process based on inverse FFT. The instrument is intuitive to play, allowing the performer to create sound by "touching" a virtual sound wall.
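The core idea of inverse-FFT sonification can be sketched as follows: treat a column of image intensities as the magnitudes of frequency bins and inverse-transform it into a short audio frame. This is a simplified illustration of the general technique, not the Kinectofon's actual pipeline; the function name and zero-phase assumption are my own.

```python
import cmath

def sonify_column(magnitudes):
    """Turn one image column into audio samples via an inverse DFT.

    Each pixel row is treated as the magnitude of one frequency bin
    (with zero phase), so each bright pixel contributes one cosine
    partial to the output frame. Illustrative sketch only.
    """
    n = len(magnitudes)
    samples = []
    for t in range(n):
        # Inverse DFT: x[t] = (1/N) * sum_k X[k] * e^(2*pi*i*k*t/N)
        acc = sum(m * cmath.exp(2j * cmath.pi * k * t / n)
                  for k, m in enumerate(magnitudes))
        samples.append(acc.real / n)
    return samples
```

In a real system one would use an FFT library, overlap-add successive frames, and map pixel position to a musically useful frequency range; the loop above only shows the bin-to-partial correspondence.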


Torresen, J., Hafting, Y., and Nymoen, K. (2013). A new wi-fi based platform for wireless sensor data collection. In Proceedings of the International Conference on New Interfaces For Musical Expression, pages 337–340, Daejeon, Korea.

A custom-designed WLAN (Wireless Local Area Network) based sensor interface is presented in this paper. It is aimed at wirelessly interfacing a large variety of sensors to supplement the built-in sensors in smartphones and media players. The target application area is the collection of human motion and condition data for use in musical applications. The interface is based on commercially available units and allows for up to nine sensors. The benefit of using WLAN-based communication is a high data rate with low latency. Our experiments show that the average transmission time is less than 2 ms for a single sensor. Further, it is operational for a whole day without battery recharging.
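The latency figure in the abstract comes from measuring transmission times. The style of such a measurement can be sketched with a local UDP echo loop; this is purely illustrative, since the paper's actual hardware and protocol are not reproduced here, and the function name and packet count are assumptions.

```python
import socket
import threading
import time

def measure_udp_rtt(n_packets=20):
    """Measure average UDP round-trip time to a local echo server.

    Illustrative only: this measures loopback latency on one machine,
    standing in for the paper's over-the-air WLAN measurements.
    """
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))          # OS picks a free port
    port = server.getsockname()[1]

    def echo():
        # Echo each received datagram straight back to the sender.
        for _ in range(n_packets):
            data, addr = server.recvfrom(64)
            server.sendto(data, addr)

    threading.Thread(target=echo, daemon=True).start()

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.settimeout(2.0)
    rtts = []
    for i in range(n_packets):
        start = time.perf_counter()
        client.sendto(str(i).encode(), ("127.0.0.1", port))
        client.recvfrom(64)                 # wait for the echo
        rtts.append(time.perf_counter() - start)

    client.close()
    server.close()
    return sum(rtts) / len(rtts)
```

A real evaluation would timestamp packets at the sensor unit and the receiver rather than using an echo, but the round-trip division of labor is the same.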

Published May 27, 2013 9:18 AM - Last modified Feb. 13, 2019 9:53 AM