Abstract: In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people's minds when they perceive or imagine music. Chunks are here understood as holistically conceived and perceived fragments of action and sound, typically with durations in the 0.5 to 5 seconds range. There is also evidence suggesting the occurrence of coarticulation within these chunks, meaning the fusion of small-scale actions and sounds into more superordinate actions and sounds. Various aspects of chunking and coarticulation are discussed in view of their role in the production and perception of music, and it is suggested that coarticulation is an integral element of music and should be more extensively explored in the future.
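The chunking idea described in the abstract can be sketched computationally. As a hedged illustration (not the method used in the research above), the following Python segments an amplitude envelope at its lowest-energy points, keeping each chunk within the 0.5 to 5 second range mentioned in the abstract:

```python
# Hedged sketch: segment a 1-D amplitude envelope into "chunks" at energy
# minima, constraining chunk durations to 0.5-5 s. This is only an
# illustration of the chunking idea, not the authors' analysis method.
import math

def chunk_boundaries(envelope, sr, min_dur=0.5, max_dur=5.0):
    """Return sample indices where one chunk ends and the next begins."""
    min_len = int(min_dur * sr)
    max_len = int(max_dur * sr)
    boundaries = [0]
    start = 0
    while len(envelope) - start > max_len:
        # Cut at the lowest-energy point within the allowed duration window.
        window = envelope[start + min_len:start + max_len]
        cut = start + min_len + window.index(min(window))
        boundaries.append(cut)
        start = cut
    boundaries.append(len(envelope))
    return boundaries

# Toy example: a 12 s envelope sampled at 100 Hz, with energy dips every 1.5 s.
sr = 100
env = [abs(math.sin(2 * math.pi * t / (3 * sr))) + 0.1 for t in range(12 * sr)]
bounds = chunk_boundaries(env, sr)
durations = [(b - a) / sr for a, b in zip(bounds, bounds[1:])]
print(durations)  # every chunk falls between 0.5 and 5 s
```

The duration window is doing the conceptual work here: boundaries are only allowed where they produce chunks of roughly the size that people seem to perceive music in.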
Cynthia M. Grund, Network Coordinator for NNIMIPA, has posted a page with pictures and videos of the motion capture session done with American pianist William Westney during his visit to Oslo in February. There are also links to a video recording of a short discussion between Cynthia Grund, William Westney and Alexander Refsum Jensenius on some of the topics discussed during the NNIMIPA workshop in Oslo.
Postdoctoral researcher Kyrre Glette participated in (and won!) the 64kB intro competition at the Assembly computer festival in Helsinki. A 64kB intro is an executable program of at most 64 kB that generates its graphics and music in real time.
The animation includes a dancing robot, where the motion is based on data recorded with our new Qualisys infrared motion capture system.
Graphics programming done by Kim Kalland, Thomas Kristensen and Kyrre Glette. Sound programming and music by Gergely Szelei-Kis.
Our new motion capture system is presented in the Qualisys newsletter from May. We have been working with Qualisys to create an integrated solution for handling recording and streaming of music-related body movement data, and look forward to working with the new system in the coming years!
fourMs researchers will perform at the VERDIKT conference:
iPhone ensemble playing Bloom and Scrambler (for iPhone and small speakers). Featuring Alexander Refsum Jensenius, Kristian Nymoen, Anders Tveit, Arve Voldsund and Viet Phi Uy Hoang.
Dance Jockey by Yago de Quay and Ståle Skogstad (using Xsens inertial motion capture)
fourMs researchers will participate in the Department of Musicology's semester opening concert.
Kristian Nymoen, Anders Tveit, Alexander Refsum Jensenius: Bloom and Scrambler (for iPhone and small speakers)
Yago de Quay, Ståle Skogstad: Posture (with Xsens motion capture)
FourMs will host an international workshop on motion capture in music 12-16 October 2009, with guests from McGill University and the University of Jyväskylä. The workshop is informal and open to anyone interested in the topic.
In the Dance Jockey project, we have used a full-body inertial motion capture system, the Xsens MVN suit, for musical interaction. The name Dance Jockey is a play on the well-known term Disc Jockey, or DJ: instead of using discs to perform music, we use dance, or full-body motion, as the basis for the performance.
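The basic idea of driving sound from full-body motion can be sketched in a few lines. As an illustrative assumption (the actual Dance Jockey mappings are not described here, and the frame rate and scaling constant below are made up), the following Python estimates the acceleration of one tracked body segment from streamed 3-D position samples and maps its magnitude to a MIDI-style control value:

```python
# Hedged sketch of motion-to-sound mapping: estimate acceleration from a
# stream of 3-D position samples, such as a full-body system like Xsens MVN
# could provide for a hand segment, and scale it to a 0-127 control value.
# Frame rate (120 Hz) and scale factor are illustrative assumptions.

def acceleration(p0, p1, p2, dt):
    """Second finite difference of three successive 3-D positions."""
    return tuple((c - 2 * b + a) / (dt * dt) for a, b, c in zip(p0, p1, p2))

def to_control(accel, scale=0.2):
    """Map acceleration magnitude to a MIDI-style 0-127 value (clipped)."""
    mag = sum(x * x for x in accel) ** 0.5
    return min(127, int(mag * scale))

dt = 1 / 120  # assumed 120 Hz frame rate
frames = [(0.0, 1.0, 0.0), (0.0, 1.01, 0.0), (0.0, 1.04, 0.0)]  # hand rising
accel = acceleration(*frames, dt)
print(to_control(accel))
```

A mapping like this reacts to how energetically the performer moves rather than to static posture, which is one simple way of letting dance, not discs, drive the music.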
New publication by Kristian Nymoen, Rolf Inge Godøy, Jim Torresen, and Alexander Refsum Jensenius.
This week Victoria Johnson and Trond Lossius visit fourMs for a workshop on violin motion capture and sound spatialisation. We will test the commercially available K-Bow with IRCAM's Gesture Follower, MuBu, and the recently released Machine Learning toolkit for Max, and will also test the Jamoma sound spatialisation modules on the 36.2 speaker rig.
Researchers from fourMs will hold a motion capture workshop during this year's Art.on.Wires. This is a chance to work with a full-body inertial motion capture system (Xsens MVN). More information at art-on-wires.org.