Video Presentation
Specialisms/Tutoring
- Music Theory and Composition
- Musical Cognition
- Musical Analysis
- Music Technology
Current Research Activities
My main interest is in phenomenological approaches to music theory, meaning that subjective impressions of musical sound are taken as the point of departure for music theory: exploring the content of these subjective images of musical sound, and then correlating them with the acoustic features of the sound. I have thus previously focused mostly on mental images of musical sound, including work on musical imagery, but in recent decades also on music-related body motion. This led me to direct the Musical Gestures Project (2004-2007) and the Sensing Music-Related Actions Project (2008-2012), and to participate in several topically related European research projects, all using various conceptual and technological tools to explore the relationships between sound and body motion in the experience of music. My involvement in these projects has resulted in a number of publications on music-related body motion, including a book on musical gestures edited with my Belgian colleague Marc Leman (Godøy and Leman 2010). I continued this work in the interdisciplinary fourMs Group that I directed at the University of Oslo from 2013 to 2016, and now continue it in the RITMO Centre of Excellence.
Tags:
Music Theory,
Music Cognition,
Music and Motion,
Music Technology
Publications
-
Godøy, Rolf Inge (2018). Motor constraints shaping musical experience. Music Theory Online.
ISSN 1067-3040.
24(3).
In recent decades, we have seen a surge in published work on embodied music cognition, and it is now broadly accepted that musical experience is intimately linked with experiences of body motion. It is also clear that music performance is not something abstract and without restrictions, but something traditionally (i.e., before the advent of electronic music) constrained by our possibilities for body motion. The focus of this paper is on these various constraints of sound-producing body motion that shape the emergent perceptual features of musical sound, as well as on how these constraints may enhance our understanding of agency in music perception.
-
Godøy, Rolf Inge (2018). Motormimetic features in musical experience, In Patrizia Veroli & Gianfranco Vinay (ed.),
Music-dance: sound and motion in contemporary discourse.
Routledge.
ISBN 9781138280519.
Chapter 12, pp. 207–221.
There can be no doubt that we often experience correspondences between different sense modalities in music, such as between sound, vision, motion, and touch (just to mention the most prominent ones). This is evident in dance and other kinds of music-related body motion, and also reflected in listeners' innumerable accounts of visual associations with music, and in the ubiquitous use of visual metaphors for musical sound such as "rough", "smooth", "narrow", "broad", etc. In short, it should not be controversial to say that music is a multimodal form of art, and that music involves a number of sensations in addition to pure sound. But precisely how different sense modalities are activated, and how they interact in musical experience, still presents us with a number of unanswered questions. In our research, we have been pursuing the idea of what we call "motormimetic cognition", meaning an incessant mental simulation of sound-related body motion in music perception, primarily of assumed sound-producing body motion (e.g. hitting, stroking, bowing, blowing), but also of various kinds of sound-accompanying body motion (e.g. dancing, walking, gesticulating). We regard such mental simulation of body motion as applicable to most features of music, and we believe motormimetic cognition can also translate between different modalities, for instance that mental simulation of motion can translate between sound and visual images in musical experience. We thus believe motormimetic cognition can be the basis for a systematic research effort on feature mapping between different modalities in music, by way of studying both musicians' and listeners' body motion trajectory shapes and/or posture shapes in music-related contexts. In this chapter, we shall present basic principles of motormimetic cognition and demonstrate how it is relevant for work with new technologies and in multimedia art.
Starting out with a review of timescales in our experiences of sound and motion, we shall present a rudimentary taxonomy of sound-motion categories, proceeding in a top-down manner from some global, salient features down to a fairly large number of detail features. This should enable us to have a versatile and flexible conceptual apparatus, as well as a collection of practical tools (using available technologies for sound and motion research), that will allow us to explore our aesthetic and affective images of sound and motion in music, hopefully also contributing to bridging the gaps between quantitative and qualitative approaches in research.
-
Godøy, Rolf Inge (2018). Sonic Object Cognition, In Rolf Bader (ed.),
Springer Handbook of Systematic Musicology.
Springer Nature.
ISBN 978-3-662-55002-1.
Chapter 35, pp. 761–777.
We evidently have features at different timescales in music, ranging from the sub-millisecond timescale of single vibrations to the timescale of a couple of hundred milliseconds, manifesting perceptually salient features such as pitch, loudness, timbre, and various transients. At the larger timescales of several hundred milliseconds, we have features such as the overall dynamic and timbral envelopes of sonic events, and at slightly larger timescales, also various rhythmic, textural, melodic, and harmonic patterns. And at still larger timescales, we have phrases, sections, and whole works of music, often lasting several minutes, and in some cases, even hours. Features at these different timescales all contribute to our experience of music; however, the focus in the present chapter is on the salient features of what has been called sonic objects, meaning holistically perceived chunks of musical sound in the approximately 0.5–5 s duration range. A number of known constraints in the production and perception of musical sound, as well as in human behavior and perception in general, seem to converge in designating this timescale as crucial for our experience of music. The aim of this chapter is then to try to understand how sequentially unfolding and ephemeral sound and sound-related body motion can somehow be transformed in our minds into sonic objects.
-
Godøy, Rolf Inge (2018). Sound-Motion Bonding in Body and Mind, In Youn Kim & Sander L. Gilman (ed.),
The Oxford Handbook of Music and the Body.
Oxford University Press.
ISBN 9780190636234.
Chapter.
This chapter focuses on the links between sound and body motion in music. It can readily be observed that musical sound is produced by body motion and also triggers body motion in many contexts, meaning scholars have an inexhaustible supply of sound-motion bonding available for research. The main challenges here are to get an overview of the different kinds of sound-motion bonding at work in music, and to go deeper into the subjective experiences of sound-motion bonding. To this end, the chapter presents sound-motion bonding in a so-called motor theory perspective on perception, suggesting that whatever humans perceive of sound, motion, and/or visual features is spontaneously re-enacted in our minds, meaning active mental simulation of whatever it is that we are perceiving. This leads to the idea of sound-motion objects, entities that fuse sensations of sound and motion into salient and holistically perceived units in musical experience.
-
Gonzalez Sanchez, Victor Evaristo; Dahl, Sofia; Hatfield, Johannes Lunde & Godøy, Rolf Inge (2018). Characterizing movement fluency in musical performance: Toward a generic measure for technology enhanced learning. Frontiers in Psychology.
ISSN 1664-1078. doi: 10.3389/fpsyg.2019.00084
Virtuosity in music performance is often associated with fast, precise, and efficient sound-producing movements. The generation of such highly skilled movements involves complex joint and muscle control by the central nervous system, and depends on the ability to anticipate, segment, and coarticulate motor elements, all within the biomechanical constraints of the human body. When successful, such motor skill should lead to what we characterize as fluency in musical performance. Detecting typical features of fluency could be very useful for technology-enhanced learning systems, assisting and supporting students during their individual practice sessions by giving feedback and helping them to adopt sustainable movement patterns. In this study, we propose to assess fluency in musical performance as the ability to smoothly and efficiently coordinate while accurately performing slow, transitionary, and rapid movements. To this end, the movements of three cello players and three drummers at different levels of skill were recorded with an optical motion capture system, while a wireless electromyography (EMG) system recorded the corresponding muscle activity from relevant landmarks. We analyze the kinematic and coarticulation characteristics of these recordings separately and then propose a combined model of fluency in musical performance predicting music sophistication. Results suggest that movements from expert performers are characterized by consistently smooth strokes and scaling of muscle phasic coactivation. The explored model of fluency as a function of movement smoothness and coarticulation patterns was shown to be limited by the sample size but serves as a proof of concept. Results from this study show the potential of a technology-enhanced objective measure of fluency in musical performance, which could lead to improved practices for aspiring musicians, instructors, and researchers.
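Movement smoothness of the kind discussed in this abstract is commonly quantified with jerk-based measures. As a hedged illustration (a generic dimensionless-jerk computation, not the paper's actual fluency model, which also involves EMG coactivation), a sketch might look like this:

```python
import numpy as np

def dimensionless_jerk(position, fs):
    """Jerk-based smoothness of a 1-D position trajectory.

    position: positions (e.g. metres) sampled at fs Hz.
    Returns a negative, dimensionless value; closer to zero means
    smoother. Illustrative sketch only.
    """
    position = np.asarray(position, dtype=float)
    dt = 1.0 / fs
    velocity = np.gradient(position, dt)
    jerk = np.gradient(np.gradient(velocity, dt), dt)
    duration = dt * (len(position) - 1)
    v_peak = np.max(np.abs(velocity))
    squared_jerk_integral = np.sum(jerk ** 2) * dt
    # Scale the integrated squared jerk to a dimensionless quantity.
    return -(duration ** 3 / v_peak ** 2) * squared_jerk_integral

# A smooth, minimum-jerk-like stroke scores closer to zero than a jittery one.
fs = 240.0                                      # a typical capture rate
t = np.linspace(0, 1, int(fs) + 1)
smooth = 10 * t**3 - 15 * t**4 + 6 * t**5       # minimum-jerk profile
jittery = smooth + 0.01 * np.sin(2 * np.pi * 25 * t)
print(dimensionless_jerk(smooth, fs) > dimensionless_jerk(jittery, fs))  # True
```

The negative sign follows the convention that larger (less negative) values indicate smoother movement.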
-
Godøy, Rolf Inge (2017). Key-postures, trajectories and sonic shapes, In Daniel Leech-Wilkinson & Helen M. Prior (ed.),
Music and Shape.
Oxford University Press.
ISBN 978-0199351411.
Chapter 1, pp. 4–29.
The focus of this chapter is on how our notions of shape in music emerge from experiences of sound-producing body motion such as hitting, stroking, bowing, shaking or blowing. Sound-producing body motion is seen as organized around postures at salient moments in the music, around so-called key-postures, and as making continuous trajectories between these key-postures. It is suggested that our experiences of making and/or seeing such key-postures and continuous trajectories in sound-producing body motion link the sonic and visual elements in music, meaning that body motion strongly contributes to our notions of shape in music.
-
Godøy, Rolf Inge (2017). Postures and motion shaping musical experience, In Micheline Lesaffre; Pieter-Jan Maes & Marc Leman (ed.),
The Routledge Companion to Embodied Music Interaction.
Routledge.
ISBN 9781138657403.
Chapter 12, pp. 113–121.
Sound-producing body motion and associated body postures shape musical sound in interaction with musical instruments or the vocal apparatus, making images of such body motion and postures integral to our experiences of music. In this chapter we shall look at some sound-producing body motion features, and focus on how postures at salient moments in time serve as landmarks for both motion and sound, as well as how the fusion of small-scale motion units and sounds by so-called coarticulation creates posture-centered sound-motion objects in musical experience.
-
Godøy, Rolf Inge; Song, Min-Ho & Dahl, Sofia (2017). Exploring Sound-Motion Textures in Drum Set Performance. Proceedings of the SMC Conferences.
ISSN 2518-3672.
pp. 145–152.
A musical texture, be that of an ensemble or of a solo instrumentalist, may be perceived as combinations of both simultaneous and sequential sound events. However, we believe that sensations of the corresponding sound-producing events (e.g. hitting, stroking, bowing, blowing) also contribute to our perceptions of musical textures. Musical textures could thus be understood as multimodal, with features of both sound and motion, hence the idea here of sound-motion textures in music. The study of such multimodal sound-motion textures will necessitate collecting and analyzing data of both the produced sound and of the sound-producing body motion, thus entailing a number of methodological challenges. In our current work on sound-motion textures in music, we focus on short and idiomatic figures for different instruments (e.g. ornaments on various instruments), and in this paper, we present some ideas, challenges, and findings on typical sound-motion textures in drum set performance. Drum set performance is particularly interesting because the often very complex textures are produced by one single performer, entailing a number of issues of human motion and motor control.
-
Godøy, Rolf Inge (2016). Timescales in Musical Experience, In Sven Hroar Klempe (ed.),
Cultural Psychology of Musical Experience.
Information Age Publishing.
ISBN 978-1681234847.
Chapter 9, pp. 165–178.
-
Godøy, Rolf Inge; Song, Min-Ho; Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum (2016). Exploring Sound-Motion Similarity in Musical Experience. Journal of New Music Research.
ISSN 0929-8215.
45(3), pp. 210–222. doi: 10.1080/09298215.2016.1184689
People tend to perceive many and also salient similarities between musical sound and body motion in musical experience, as can be seen in countless situations of music performance or listening to music, and as has been documented by a number of studies in the past couple of decades. The so-called motor theory of perception has claimed that these similarity relationships are deeply rooted in human cognitive faculties, and that people perceive and make sense of what they hear by mentally simulating the body motion thought to be involved in the making of sound. In this paper, we survey some basic theories of sound-motion similarity in music, and in particular the motor theory perspective. We also present findings regarding sound-motion similarity in musical performance, in dance, in so-called sound-tracing (the spontaneous body motions people produce in tandem with musical sound), and in sonification, all in view of providing a broad basis for understanding sound-motion similarity in music.
-
Song, Min-Ho & Godøy, Rolf Inge (2016). How fast is your body motion? Determining a sufficient frame rate for an optical motion tracking system using passive markers. PLoS ONE.
ISSN 1932-6203.
11(3). doi: 10.1371/journal.pone.0150993
Full text in Research Archive.
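The question in this paper's title can be sketched in code: estimate the highest frequency carrying meaningful energy in a marker trajectory, then apply the Nyquist criterion with a safety margin. A hypothetical illustration (the function name, `energy_fraction`, and `margin` parameters are assumptions, not the paper's method):

```python
import numpy as np

def sufficient_frame_rate(trajectory, fs, energy_fraction=0.99, margin=2.0):
    """Estimate a sufficient capture rate for a 1-D marker trajectory.

    Finds the frequency below which `energy_fraction` of the spectral
    energy lies, then applies the Nyquist criterion with a safety
    margin. Illustrative sketch only.
    """
    x = np.asarray(trajectory, dtype=float)
    x = x - x.mean()                       # ignore the DC offset
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    cumulative = np.cumsum(spectrum) / np.sum(spectrum)
    f_max = freqs[np.searchsorted(cumulative, energy_fraction)]
    return margin * 2.0 * f_max            # Nyquist rate times margin

# A pure 5 Hz oscillation needs comfortably more than 10 fps to capture.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
motion = np.sin(2 * np.pi * 5.0 * t)
print(sufficient_frame_rate(motion, fs))   # 20.0 for this toy signal
```

Real sound-producing motion (e.g. fast drum strokes) has energy at much higher frequencies than this toy signal, which is the motivation for the paper's question.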
-
Godøy, Rolf Inge (2015). Ubiquitous Motor Cognition in Musical Experience: Open Peer Review of Jacques Launay’s “Musical Sounds, Motor Resonance, and Detectable Agency”. Empirical Musicology Review.
ISSN 1559-5749.
10(1), pp. 41–45.
Motor cognition, defined as the capacity to conceive, plan, control, perceive, and imagine body motion, is here seen as a ubiquitous element in music: music is produced by body motion, people often move in various ways when listening to music, and images of body motion seem to be integral to mental images of musical sound. Given this ubiquity of motor cognition in musical experience, it could be argued that motor cognition is a fundamental element in music, and thus could be hypothesized to also have been an essential element in the evolution of music, regardless of whether music is seen as primarily a social or as a more solitary phenomenon. It could furthermore be argued that music in all cases has intersubjective significance because of shared motor cognition among people, and also that this motor cognition may be applied to most perceptually salient features of music.
-
Godøy, Rolf Inge (2014). Ecological Constraints of Timescales, Production, and Perception in Temporal Experiences of Music: A Commentary on Kon (2014). Empirical Musicology Review.
ISSN 1559-5749.
In trying to structure our discussions of temporal experience in music, it could be useful to look at some basic ecological constraints of timescales, production, and perception of music. This may hopefully help us to distinguish between, on the one hand, readily perceived features of sound and music-related body motion, i.e. concrete sonic, kinematic, and proprioceptive features, and on the other hand, more generic, amodal, and abstract elements in musical discourse, manifest in various symbolic representations such as notation, numbers, and diagrams. Given easily accessible music technologies, it is possible to experiment with different editions of musical works, i.e. concatenate fragments in a different order and then evaluate the emergent contextual effects in listening experiments. Also, given the faculties of musical imagery (defined as our ability to mentally re-experience musical sound and body motion in the absence of physically present sound and body motion), we can at will recombine chunks of music in our minds and mentally scan through large musical works. The contention here is that such recombination in actual re-editing of musical sound or in musical imagery will still be related to the basic ecological constraints of the timescales, production, and perception in music.
-
Godøy, Rolf Inge (2014). Understanding Coarticulation in Musical Experience, In Mitsuko Aramaki; Olivier Derrien; Richard Kronland-Martinet & Sølvi Ystad (ed.),
Sound, Music, and Motion.
Springer Publishing Company.
ISBN 978-3-319-12975-4.
Chapter, pp. 535–547.
The term coarticulation designates the fusion of small-scale events, such as single sounds and single sound-producing actions, into larger units of combined sound and body motion, resulting in qualitatively new features at what we call the chunk timescale in music, typically in the 0.5–5 s duration range. Coarticulation has been extensively studied in linguistics and to a certain extent in other domains of human body motion as well as in robotics, but so far not so much in music, so the main aim of this paper is to provide a background for how we can explore coarticulation in both the production and perception of music. The contention is that coarticulation in music should be understood as based on a number of physical, biomechanical, and cognitive constraints, and that coarticulation is an essential factor in the shaping of several perceptually salient features of music.
-
Haugen, Mari Romarheim & Godøy, Rolf Inge (2014). Rhythmical Structures in Music and Body Movement in Samba Performance, In Moo Kyoung Song (ed.),
Proceedings of the ICMPC-APSCOM 2014 Joint Conference: 13th Biennial International Conference for Music Perception and Cognition and 5th Triennial Conference of the Asia Pacific Society for the Cognitive Sciences of Music.
College of Music, Yonsei University.
ISBN 978-89-89544-53-1.
Article in proceedings, pp. 46–52.
Full text in Research Archive.
-
Godøy, Rolf Inge (2013). Quantal Elements in Musical Experience, In Rolf Bader (ed.),
Sound - Perception - Performance.
Springer.
ISBN 9783319001067.
Chapter 4, pp. 113–128.
The aim of this paper is to present a model for understanding unit formation, what we prefer to call chunking, at short-term timescales in musical experience, typically in the duration range of approximately 0.5 to 5 seconds. The idea is that at these short-term timescales, chunks of sound and associated body motion are conceived and perceived holistically, hence demonstrating what may be called quantal elements in musical experience. Many salient musical features for identifying style, motion, and affect can be found at such short-term timescales. Also, there are several constraints in perception, cognition, and body motion that converge in suggesting quantal elements at work in music. A better understanding of such quantal elements in musical experience could be useful in the fields of music perception, music analysis, and music information retrieval, as well as in various practical artistic and educational contexts.
-
Godøy, Rolf Inge (2013). Shape Cognition and Temporal, Instrumental and Cognitive Constraints on Tonality. Public Peer Review of “Tonality: The Shape of Affect” by Mine Doğantan-Dack. Empirical Musicology Review.
ISSN 1559-5749.
8(3-4), pp. 223–226.
-
Godøy, Rolf Inge (2013). Thinking Sound and Body-Motion Shapes in Music: Public Peer Review of “Gesture and the Sonic Event in Karnatak Music” by Lara Pearson. Empirical Musicology Review.
ISSN 1559-5749.
8(1), pp. 15–18.
-
Jensenius, Alexander Refsum & Godøy, Rolf Inge (2013). Sonifying the shape of human body motion using motiongrams. Empirical Musicology Review.
ISSN 1559-5749.
8(2), pp. 73–83. Full text in Research Archive.
The paper presents sonomotiongram, a technique for the creation of auditory displays of human body motion based on motiongrams. A motiongram is a visual display of motion, based on frame differencing and reduction of a regular video recording. The resultant motiongram shows the spatial shape of the motion as it unfolds in time, somewhat similar to the way in which spectrograms visualise the shape of (musical) sound. The visual similarity of motiongrams and spectrograms is the conceptual starting point for the sonomotiongram technique, which explores how motiongrams can be turned into sound using “inverse FFT”. The paper presents the idea of shape-sonification, gives an overview of the sonomotiongram technique, and discusses sonification examples of both simple and complex human motion.
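The frame-differencing reduction behind a motiongram can be sketched in a few lines. A hedged illustration with grayscale video as a NumPy array (not the authors' actual implementation, which works on regular video files):

```python
import numpy as np

def motiongram(frames, axis=2):
    """Compute a motiongram from grayscale video frames.

    frames: array of shape (time, height, width), values in [0, 1].
    Frame differencing keeps only the pixels that change; collapsing
    the width axis then leaves one column per frame pair, giving a
    (height, time) image of the motion's vertical shape over time.
    Sketch only.
    """
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))   # motion image per frame pair
    columns = diffs.mean(axis=axis)           # collapse width: (time-1, height)
    return columns.T                          # (height, time-1)

# Toy example: a bright dot moving downward leaves a descending trace.
video = np.zeros((6, 8, 8))
for frame in range(6):
    video[frame, frame + 1, 4] = 1.0          # dot at row frame+1, column 4
mg = motiongram(video)
print(mg.shape)                               # (8, 5): height x (frames - 1)
```

The sonomotiongram technique then treats such a (height, time) image analogously to a spectrogram and resynthesizes it as sound; the "inverse FFT" step is not sketched here.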
-
Nymoen, Kristian; Godøy, Rolf Inge; Jensenius, Alexander Refsum & Tørresen, Jim (2013). Analyzing Correspondence between Sound Objects and Body Motion. ACM Transactions on Applied Perception.
ISSN 1544-3558.
10(2). doi: 10.1145/2465780.2465783
Full text in Research Archive.
Links between music and body motion can be studied through experiments called sound-tracing. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an experiment in which participants moved their hands following perceptual features of short sound objects. Motion capture data has been analyzed and correlated with a set of quantitative sound features using four different methods: (a) a pattern recognition classifier, (b) t-tests, (c) Spearman’s ρ correlation, and (d) canonical correlation. This article shows how the analysis methods complement each other, and that applying several analysis techniques to the same data set can broaden the knowledge gained from the experiment.
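Of the four methods mentioned, Spearman's ρ is the simplest to sketch. A hypothetical example correlating a per-sound motion feature with a sound feature (the feature names and values are illustrative, not the study's data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-sound-object features: the sound's spectral centroid
# versus mean vertical hand position in the tracing (illustrative numbers).
spectral_centroid = np.array([220.0, 450.0, 610.0, 880.0, 1200.0, 1500.0])
hand_height = np.array([0.12, 0.25, 0.31, 0.44, 0.58, 0.70])

# Spearman's rho measures monotonic association between ranked variables,
# which suits perceptual features with non-linear scaling.
rho, p_value = stats.spearmanr(spectral_centroid, hand_height)
print(round(rho, 3))   # perfectly monotonic toy data -> 1.0
```

On this toy data the association is perfectly monotonic, so ρ = 1.0; the study's point is that the four methods probe complementary aspects of the same sound-motion data.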
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum; Voldsund, Arve; Glette, Kyrre Harald; Høvin, Mats Erling; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Tørresen, Jim (2012). Classifying Music-Related Actions, In Emilios Cambouropoulos; Costas Tsougras; Panayotis Mavromatis & Konstantinos Pastiadis (ed.),
Proceedings of the ICMPC-ESCOM 2012 Joint Conference: 12th Biennial International Conference for Music Perception and Cognition, 8th Triennial Conference of the European Society for the Cognitive Sciences of Music.
School of Music Studies, Aristotle University of Thessaloniki, Thessaloniki, Hellas.
ISBN 978-960-99845-1-5.
Article in proceedings, pp. 352–357.
Full text in Research Archive.
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used to refer to chunks of combined sound and body motion, typically in the duration range of approximately 0.5 to 5 seconds. We believe that chunk-level music-related actions are highly significant for the experience of music, and we are presently working on establishing a database of music-related actions in order to facilitate access to, and research on, our fast-growing collection of motion capture data and related material. In this work, we are confronted with a number of perceptual, conceptual, and technological issues regarding the classification of music-related actions, issues that will be presented and discussed in this paper.
-
Kozak, Mariusz; Nymoen, Kristian & Godøy, Rolf Inge (2012). The Effects of Spectral Features of Sound on Gesture Type and Timing, In Eleni Efthimiou; Georgios Kouroupetroglou & Stavroula-Evita Fotinea (ed.),
Gesture and Sign Language in Human-Computer Interaction and Embodied Communication.
Springer.
ISBN 978-3-642-34181-6.
Chapter, pp. 69–80.
-
Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge & Jensenius, Alexander Refsum (2012). A Statistical Approach to Analyzing Sound Tracings, In Sølvi Ystad; Mitsuko Aramaki; Richard Kronland-Martinet; Kristoffer Jensen & Sanghamitra Mohanty (ed.),
Speech, Sound and Music Processing: Embracing Research in India. 8th International Symposium, CMMR 2011. 20th International Symposium, FRSM 2011.
Springer.
ISBN 978-3-642-31979-2.
Book chapter, pp. 120–145.
Full text in Research Archive.
This paper presents an experiment on sound tracing, meaning an experiment on how people relate motion to sound. 38 participants were presented with 18 short sounds, and instructed to move their hands in the air while acting as though the sound was created by their hand motion. The hand motion of the participants was recorded, and has been analyzed using statistical tests, comparing results between different sounds, between different subjects, and between different sound classes. We have identified several relationships between sound and motion which are present in the majority of the subjects. A clear distinction was found in onset acceleration for motion to sounds with an impulsive dynamic envelope compared to non-impulsive sounds. Furthermore, vertical movement has been shown to be related to sound frequency, both in terms of spectral centroid and pitch. Moreover, a significantly higher amount of overall acceleration was observed for non-pitched sounds as compared to pitched sounds.
-
Godøy, Rolf Inge (2011). Coarticulated gestural-sonorous objects in music, In Anthony Gritten & Elaine King (ed.),
New perspectives on music and gesture.
Ashgate.
ISBN 978-0-7546-6462-8.
Chapter 3, pp. 67–82.
-
Godøy, Rolf Inge (2011). Sound-action awareness in music, In David Ian Clarke & Eric F. Clarke (ed.),
Music and consciousness: philosophical, psychological, and cultural perspectives.
Oxford University Press.
ISBN 978-0-19-955379-2.
Chapter 13, pp. 231–243.
-
Godøy, Rolf Inge (2011). Sound-action chunks in music. Springer Tracts in Advanced Robotics.
ISSN 1610-7438.
pp. 13–26.
-
Glette, Kyrre Harald; Jensenius, Alexander Refsum & Godøy, Rolf Inge (2010). Extracting action-sound features from a sound-tracing study, In Sule Yildirim & Anders Kofod-Petersen (ed.),
Proceedings of the second Norwegian Artificial Intelligence Symposium: November 22, 2010, Høgskolen i Gjøvik.
Tapir Akademisk Forlag.
ISBN 978-82-519-2704-8.
Chapter, pp. 63–66.
Full text in Research Archive.
The paper addresses possibilities of extracting information from music-related actions, in the particular case of what we call sound-tracings. These tracings are recordings from a graphics tablet of subjects' drawings associated with a set of short sounds. Although the subjects' associations to sounds are very subjective, and thus the resulting tracings are very different, an attempt is made at extracting some global features which can be used for comparison between tracings. These features are then analyzed and classified with an SVM classifier.
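Classifying global tracing features with an SVM, as described above, can be sketched with scikit-learn. Synthetic features stand in for the real tablet data (a hypothetical example, not the paper's features or parameters):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for global tracing features, e.g. total stroke
# length and mean vertical direction, for two sound classes
# (impulsive vs. sustained). Purely illustrative data.
rng = np.random.default_rng(0)
n = 100
impulsive = np.column_stack([rng.normal(2.0, 0.5, n), rng.normal(-1.0, 0.3, n)])
sustained = np.column_stack([rng.normal(5.0, 0.5, n), rng.normal(1.0, 0.3, n)])
X = np.vstack([impulsive, sustained])
y = np.array([0] * n + [1] * n)

# Train an RBF-kernel SVM and check held-out accuracy.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(accuracy > 0.9)   # well-separated toy classes classify almost perfectly
```

With real tracings the classes overlap far more, which is why the paper restricts itself to a few global features for cross-subject comparison.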
-
Godøy, Rolf Inge (2010). Gestural Affordances of Musical Sound, In Rolf Inge Godøy & Marc Leman (ed.),
Musical Gestures: Sound, Movement, and Meaning.
Routledge.
ISBN 978-0-415-99887-1.
Chapter 5, pp. 103–125.
-
Godøy, Rolf Inge (2010). Images of Sonic Objects. Organised Sound.
ISSN 1355-7718.
15(1), pp. 54–62. doi: 10.1017/S1355771809990264
Full text in Research Archive.
Based on innumerable informal accounts and a number of scientific studies, there can be no doubt that people often have quite vivid images of musical sound in their minds, and that this is the case regardless of levels of musical training. Various introspective accounts and more recent neurocognitive research seem to converge in suggesting that imagery for music is closely linked with imagery for music-related actions. In this paper, the consequences of sound–action links for our notions of the sonic image are discussed, with a particular focus on the relationship between sonic objects and action chunks. In conclusion, the exploitation of action imagery is seen as holding great promise in enhancing our means for musical imagery in various creative, research and educational contexts.
-
Godøy, Rolf Inge (2010). Thinking Now-Points in Music-Related Movement, In Rolf Bader; Christiane Neuhaus & Ulrich Morgenstern (ed.),
Concepts, Experiments, and Fieldwork: Studies in Systematic Musicology and Ethnomusicology.
Peter Lang Publishing Group.
ISBN 978-3-631-58902-1.
Chapter 12, pp. 241–258.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Nymoen, Kristian (2010). Chunking in Music by Coarticulation. Acta Acustica united with Acustica.
ISSN 1610-1928.
96(4), pp. 690–700. doi: 10.3813/AAA.918323
Full text in Research Archive.
In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people’s minds when they perceive or imagine music. Chunks are here understood as holistically conceived and perceived fragments of action and sound, typically with durations in the 0.5 to 5 seconds range. There is also evidence suggesting the occurrence of coarticulation within these chunks, meaning the fusion of small-scale actions and sounds into more superordinate actions and sounds. Various aspects of chunking and coarticulation are discussed in view of their role in the production and perception of music, and it is suggested that coarticulation is an integral element of music and should be more extensively explored in the future.
-
Halmrast, Tor; Guettler, Knut; Bader, Rolf & Godøy, Rolf Inge (2010). Gesture and Timbre, In Rolf Inge Godøy & Marc Leman (ed.),
Musical Gestures: Sound, Movement, and Meaning.
Routledge.
ISBN 978-0-415-99887-1.
Chapter 8, pp. 183–211.
-
Jensenius, Alexander Refsum; Glette, Kyrre Harald; Godøy, Rolf Inge; Høvin, Mats Erling; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Tørresen, Jim (2010). fourMs, University of Oslo – Lab Report, In Robert Rowe & Dimitris Samaras (ed.),
Proceedings of the International Computer Music Conference, June 1-5 2010, New York.
International Computer Music Association.
ISBN 0-9713192-8-6.
Chapter, pp. 290–293.
Full text in Research Archive.
The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the fourMs lab is centred around studies of basic issues in music cognition, machine learning and robotics.
-
Jensenius, Alexander Refsum; Wanderley, Marcelo M.; Godøy, Rolf Inge & Leman, Marc (2010). Musical Gestures: concepts and methods in research, In Rolf Inge Godøy & Marc Leman (ed.),
Musical Gestures: Sound, Movement, and Meaning.
Routledge.
ISBN 978-0-415-99887-1.
Chapter 2, pp. 12–35.
Full text in Research Archive.
This chapter starts with a review of some current definitions of "gesture". The second part presents a conceptual framework for differentiating various functional aspects of gestures in music performance. The third part presents a brief overview of some methodological approaches that can be used in gesture research.
-
Leman, Marc & Godøy, Rolf Inge (2010). Why Study Musical Gestures?, In Rolf Inge Godøy & Marc Leman (ed.),
Musical Gestures: Sound, Movement, and Meaning.
Routledge.
ISBN 978-0-415-99887-1.
Chapter 1.
pp. 3–11
-
Godøy, Rolf Inge (2009). Chunking Sound for Musical Analysis. Lecture Notes in Computer Science.
ISSN 0302-9743.
5493, pp. 67–80
Show summary
One intriguing issue in music analysis is that of segmentation, or parsing, of continuous auditory streams into some kinds of meaningful and analytically convenient units, a process that can be denoted as chunking. The purpose of this paper is to present a theory of chunking in musical analysis based on perceptual features of sound and on our own research on musical gestures, suggesting that music-related actions are essential in the process of chunking.
-
Godøy, Rolf Inge (2009). Geometry and Effort in Gestural Renderings of Musical Sound. Lecture Notes in Computer Science.
ISSN 0302-9743.
5084, pp. 205–215. Full text in Research Archive.
Show summary
As may be seen at concerts and in various everyday listening situations, people often make spontaneous gestures when listening to music. We believe these gestures are interesting to study because they may reveal important features of musical experience. In particular, hand movements may give us information on what features are perceived as salient by listeners. Based on various current ideas on embodied cognition, the aim of this paper is to argue that gestures are integral to music perception, and to present research in support of this. A conceptual model separating geometry and effort is presented in order to better understand the variety of music-related gestures we may observe, leading up to some ideas on how to apply this conceptual model in present and future research.
-
Godøy, Rolf Inge (2009). Music Theory by Sonic Objects, In Évelyne Gayou (ed.),
Pierre Schaeffer. Polychrome Portraits.
Institut national de l'audiovisuel.
ISBN 9782869382107.
Chapter.
pp. 67–77
-
Godøy, Rolf Inge & Jensenius, Alexander Refsum (2009). Body Movement in Music Information Retrieval, In Keiji Hirata; George Tzanetakis & Kazuyoshi Yoshii (ed.),
ISMIR 2009 - Proceedings of the 10th International Society for Music Information Retrieval Conference.
International Society for Music Information Retrieval.
ISBN 978-0-9813537-0-8.
Chapter.
pp. 45–50
Full text in Research Archive.
Show summary
We can see many and strong links between music and human body movement in musical performance, in dance, and in the variety of movements that people make in listening situations. There is evidence that sensations of human body movement are integral to music as such, and that sensations of movement are efficient carriers of information about style, genre, expression, and emotions. The challenge now in MIR is to develop means for the extraction and representation of movement-inducing cues from musical sound, as well as to develop possibilities for using body movement as input to search and navigation interfaces in MIR contexts.
-
Godøy, Rolf Inge (2008). Pour une théorie musicale fondée sur l'objet sonore, In Évelyne Gayou (ed.),
Pierre Schaeffer. Portraits polychromes.
Institut national de l'audiovisuel.
ISBN 9782869382091.
Chapter.
pp. 67–75
-
Godøy, Rolf Inge (2008). Reflections on Chunking in Music, In Albrecht Schneider (ed.),
Systematic and Comparative Musicology: Concepts, Methods, Findings.
Peter Lang Publishing Group.
ISBN 9783631579534.
Chapter.
pp. 117–133
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Nymoen, Kristian (2008). Production and perception of goal-points and coarticulations in music. Journal of the Acoustical Society of America.
ISSN 0001-4966.
123(5), p. 3657. doi:
10.1121/1.2934964
Show summary
From our studies of sound-related movement (http://musicalgestures.uio.no), we have reason to believe that both sound-producing and sound-accompanying movements are centered around what we call goal-points, meaning certain salient events in the music such as downbeats, various accent types, or melodic peaks. In music performance, these goal-points are reflected in the positions and shapes of the performers' effectors (fingers, hands, arms, torso, etc.) at certain moments in time, similar to what is known as keyframes in animation. The movement trajectories between these goal-points, similar to what is known as interframes in animation, may often demonstrate the phenomenon of coarticulation, i.e. that the various smaller movements are subsumed under more superordinate and goal-directed movement trajectories. In this paper, we shall present a summary of recent human movement research in support of this scheme of goal-points and coarticulations, as well as demonstrate this scheme with data from our ongoing motion capture studies of pianists' performance and other researchers' motion capture data. ©2008 Acoustical Society of America
-
Jensenius, Alexander Refsum; Nymoen, Kristian & Godøy, Rolf Inge (2008). A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians, In Michael Alcorn (ed.),
Proceedings of the International Computer Music Conference, 2008, Belfast.
International Computer Music Association.
ISBN 0-9713192-6-X.
Chapter.
pp. 743–746
Show summary
The paper presents some challenges faced in developing an experimental setup for studying coarticulation in music-related body movements. This has included solutions for storing and synchronising motion capture, biosensor and MIDI data, and related audio and video files. The implementation is based on a multilayered Gesture Description Interchange Format (GDIF) structure, written to Sound Description Interchange Format (SDIF) files using the graphical programming environment Max/MSP.
-
Godøy, Rolf Inge (2006). Coarticulated gestural-sonorous objects in music, In Anthony Gritten & Elaine King (ed.),
Proceedings of the Second International Conference on Music and Gesture, 20–23 July 2006, Royal Northern College of Music, Manchester, UK.
GK Publishing.
ISBN 0-9553329-0-7.
Poster.
pp. 33–34
-
Godøy, Rolf Inge (2006). Gestural-Sonorous Objects: embodied extensions of Schaeffer’s conceptual apparatus. Organised Sound.
ISSN 1355-7718.
11(2), pp. 149–157. doi:
10.1017/S1355771806001439
Full text in Research Archive.
-
Godøy, Rolf Inge (2006). Motor-mimetic images of musical sound, In Mario Baroni; Anna Rita Adessi; Roberto Caterina & Marco Costa (ed.),
9th International Conference on Music Perception and Cognition. Abstracts.
Bononia University Press.
ISBN 8873951554.
Abstract.
-
Godøy, Rolf Inge; Haga, Egil & Jensenius, Alexander Refsum (2006). Exploring Music-Related Gestures by Sound-Tracing: A Preliminary Study, In Kia Ng (ed.),
Proceedings of the COST287-ConGAS 2nd International Symposium on Gesture Interfaces for Multimedia Systems (GIMS2006).
www.iscrim.org.uk/gims.
ISBN 0853162468.
Proceeding.
pp. 27–33
Full text in Research Archive.
Show summary
This is an exploration of listeners' associations of gestures with musical sounds. The subjects listen to sounds that have been chosen for various salient features, and the tracing movements made by the subjects are recorded and subsequently compared in view of common features in the tracings.
-
Godøy, Rolf Inge; Haga, Egil & Jensenius, Alexander Refsum (2006). Playing "Air instruments": Mimicry of sound-producing gestures by novices and experts. Lecture Notes in Computer Science.
ISSN 0302-9743.
3881. Full text in Research Archive.
Show summary
Both musicians and non-musicians can often be seen making sound-producing gestures in the air without touching any real instruments. Such "air playing" can be regarded as an expression of how people perceive and imagine music, and studying the relationships between these gestures and sound might contribute to our knowledge of how gestures help structure our experience of music.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge & Kvifte, Tellef (2006). Towards a gesture description interchange format, In Norbert Schnell; Frédéric Bevilacqua; Michael Lyons & Atau Tanaka (ed.),
6th International Conference on New Interfaces for Musical Expression.
IRCAM - Centre Pompidou.
ISBN 9782844263148.
Chapter.
pp. 176–179
Full text in Research Archive.
Show summary
This paper presents our need for a Gesture Description Interchange Format (GDIF) for storing, retrieving and sharing information about music-related gestures. Ideally, it should be possible to store all sorts of data from various commercial and custom made controllers, motion capture and computer vision systems, as well as results from different types of gesture analysis, in a coherent and consistent way. This would make it possible to use the information with different software, platforms and devices, and also allow for sharing data between research institutions. We present some of the data types that should be included, and discuss issues which need to be resolved.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge & Wanderley, Marcelo M. (2005). Developing Tools for Studying Musical Gestures within the Max/MSP/Jitter Environment, In Andres Lewin-Richter & Xavier Serra (ed.),
Proceedings of the International Computer Music Conference.
International Computer Music Association.
Paper.
pp. 282–285
Full text in Research Archive.
Show summary
We present the Musical Gestures Toolbox, a collection of Max/MSP/Jitter modules to help in qualitative and quantitative analysis of musical gestures. Examples are shown of how the toolbox is used for studying musical mimicry, such as "air piano" performance, and expressive gestures of musicians.
-
Godøy, Rolf Inge (2004). Gestural Imagery in the Service of Musical Imagery, In Antonio Camurri & Gualtiero Volpe (ed.),
Gesture-Based Communication in Human-Computer Interaction: 5th International Gesture Workshop, GW 2003, Genova, Italy, April 15-17, 2003, Selected Revised Papers, LNAI 2915.
Springer.
Chapter.
pp. 55–62
-
Godøy, Rolf Inge (2003). Motor-mimetic music cognition. Leonardo: Journal of the International Society for the Arts, Sciences and Technology.
ISSN 0024-094X.
36, pp. 317–319
-
Godøy, Rolf Inge (2001). Imagined Action, Excitation, and Resonance, In
Musical Imagery.
Swets & Zeitlinger, Lisse (Holland).
ISBN 90-265-1831-5.
Chapter.
pp. 237–250
Show summary
Music cognition is here seen as fundamentally cross-modal and as constrained by ecological factors, in particular by accumulated knowledge of sound-production. Patterns of imagined actions and patterns in the behaviour of resonating bodies can be a privileged path to evoking salient images of musical sound, as well as being integral to most images of musical sound in the first place.
-
Godøy, Rolf Inge (2001). Motor-mimetic music cognition, In Marcel Formosa (ed.),
Intersens et nouvelles technologies.
Laboratoire Musique et Informatique de Marseille (MIM), Marseille.
Show summary
Much research supports the idea that sensations of motor activity, in particular of sound production, are essential in the perception and cognition of musical sound. This motor-mimetic element is cross-modal, involving images of action and vision as well as of sound. http://www.labo-mim.org/pdf%20intersens/Godoy.pdf
-
Schneider, Albrecht & Godøy, Rolf Inge (2001). Perspectives and Challenges of Musical Imagery, In
Musical Imagery.
Swets & Zeitlinger, Lisse (Holland).
ISBN 90-265-1831-5.
Chapter 1.
pp. 5–26
Show summary
This is a presentation of various notions of imagery encountered in western philosophy and psychology from antiquity to our own times, with particular emphasis on the phenomenological and gestalt theoretical insights of some nineteenth and twentieth century thinkers. Several such introspective insights seem to still have relevance today, even with the advent of various neurophysiological and experimental cognitive methods.
-
Godøy, Rolf Inge (1999). Crossmodality and conceptual shapes and spaces in music theory, In Ioannis Zannos (ed.),
Music and Signs.
ASCO Art and Science, Bratislava.
pp. 85–98
Show summary
Thinking of musical sounds as shapes, and furthermore as variable or deformable shapes in multidimensional spaces for the various features, seems to be a fruitful strategy in musical analysis and could also be useful in musical semiotics and musical aesthetics. The notion of shape is founded on a cross-modal understanding of music cognition in the sense that vision and action are believed to contribute strongly to the schematic ordering of sound. The notion of spaces is founded on the well established idea of analysis by synthesis as well as on the idea of geometric or visual representations of various emergent qualities in music.
-
Jensenius, Alexander Refsum; Tveit, Anders; Godøy, Rolf Inge & Overholt, Dan (ed.) (2011). Proceedings of the International Conference on New Interfaces for Musical Expression.
Universitetet i Oslo.
ISBN 978-82-991841-6-8.
586 pp.
Show summary
The International Conference on New Interfaces for Musical Expression (NIME) is an annual interdisciplinary conference gathering 200-500 participants from all over the world to share their knowledge and late-breaking work on new musical interface design. The NIME conference started out as a workshop at the Conference on Human Factors in Computing Systems (CHI) in 2001, and has grown into one of the largest and most vital international conferences within the field of music technology.
-
Godøy, Rolf Inge & Leman, Marc (ed.) (2010). Musical Gestures: Sound, Movement, and Meaning.
Routledge.
ISBN 978-0-415-99887-1.
320 pp.
Show summary
We experience and understand the world, including music, through body movement: when we hear something, we are able to make sense of it by relating it to our body movements, or by forming an image in our minds of body movements. Musical Gestures is a collection of essays that explore the relationship between sound and movement. It takes an interdisciplinary approach to the fundamental issues of this subject, drawing on ideas, theories and methods from disciplines such as musicology, music perception, human movement science, cognitive psychology, and computer science.
-
Godøy, Rolf Inge & Jørgensen, Harald (2001). Musical Imagery.
Swets & Zeitlinger, Lisse (Holland).
ISBN 90-265-1831-5.
320 pp.
Show summary
The book presents an edited collection of papers on musical imagery. Musical imagery can be defined as our mental capacity for imagining musical sound in the absence of a directly audible sound source, meaning that we can recall and re-experience or even invent new musical sound through our "inner ear". Musical imagery is integral to music cognition, and there can be no perception, cognition or knowledge of music unless we have images of musical sound in our minds. The papers in this book reflect a wide range of approaches to musical imagery.
-
Godøy, Rolf Inge (2018). Impulse-driven sound-motion objects in musical imagery.
Show summary
Musical imagery can be defined as having experiences of music in our minds in the absence of any physically present sound. Experiences of 'tunes in the head' seem to be quite common among both musically trained and untrained people, but one of the pressing questions here is how such images are triggered, or what is the engine of musical imagery in our minds. One possible answer could be that mental images of sound-producing body motion may trigger images of sound in our minds, e.g. images of hitting motion triggering images of drum sound. In this presentation, the focus will be on motor elements of musical imagery, in particular on how fragments of combined sound and motion, what may be called sound-motion objects, are triggered by motion impulses in our minds.
-
Godøy, Rolf Inge & Song, Min-Ho (2017). Impulse-driven sound-motion objects.
Show summary
Our own and other research seems to suggest that perception and cognition of musical sound is closely linked with images of sound-producing body motion, and that chunks of sound are perceived as linked with chunks of sound-producing body motion, leading us to the concept of sound-motion objects in music (Godøy et al. 2016). One challenge in our research is trying to understand how such sound-motion objects actually emerge in music. Taking into account findings in motor control research as well as in our own research, we hypothesize that there is a so-called intermittent motor control scheme (Sakaguchi et al. 2015) at work in sound-producing body motion, meaning a discontinuous, point-by-point control scheme, resulting in a series of holistically conceived chunks of sound-producing motion, in turn resulting in the perception of music as concatenations of coherent sound-motion objects. References: Godøy, R. I., Song, M.-H., Nymoen, K., Haugen, M. R., & Jensenius, A. R. (2016). Exploring Sound-Motion Similarity in Musical Experience. Journal of New Music Research, 45(3), 210-222. doi:10.1080/09298215.2016.1184689. Sakaguchi, Y., Tanaka, M., & Inoue, Y. (2015). Adaptive intermittent control: A computational model explaining motor intermittency observed in human behavior. Neural Networks, 67, 92-109. doi:10.1016/j.neunet.2015.03.012.
-
Godøy, Rolf Inge (2016). Understanding the musical instant.
Show summary
Granted that we in musical experiences may have a range of feature durations from the short (in the area of a few hundred milliseconds) to the very long (that of several hours), the focus of this chapter is on the short range, on what we may subjectively perceive as the musical instant. This is based on the conviction that very many salient features, both in the perception and in the production of musical sound may be found at this timescale. We have had theories suggesting the importance of short fragments in musical experience, from Husserl's idea of perception by a series of 'now-points' to Schaeffer's theories of sonic objects, as well as systematic psychoacoustic studies of duration thresholds in sound perception and recent studies of duration thresholds for salient musical features by Gjerdingen and Perrot, by Krumhansl, and by Plazak and Huron. In parallel, research on human motor control has suggested that human motion is goal-directed and proceeds by what may be called key-postures at intermittent points of orientation such as at downbeats and other accents, surrounded by continuous motion, e.g. of the mallet/hand accelerating from the starting position to the impact with the drum membrane and bouncing back again to equilibrium. We believe there is a close relationship between sound-producing motion and perception, and that experiences of the musical instant are linked with biomechanical and motor control constraints involved in music. In particular, we think that motion acceleration peaks and ensuing impacts in performance are typical of salient moments in music. Understanding the musical instant as linked with various constraints of sound-producing motion could be useful for several domains of music-related research, e.g. in understanding chunking, beat extraction, and entrainment in musical experience.
-
Godøy, Rolf Inge; Song, Min-Ho; Nymoen, Kristian; Haugen, Mari Romarheim & Jensenius, Alexander Refsum (2016, 13 July). ¿Por qué marcamos el ritmo de la música con los pies? [Why do we tap our feet to the rhythm of music?]. [Internet].
BBC Mundo.
Show summary
Have you ever been in a bar or a restaurant, sitting outside or in a lounge, when music starts playing and you and others begin tapping the floor with your feet to the rhythm of the music?
-
Godøy, Rolf Inge (2015). Chunking by intermittent motor control in music.
Show summary
Various theories in music perception and cognition, from classical gestalt theory to more recent experimental work and data-driven modeling, have contributed to our understanding of chunking in musical experience. But our own research on music-related body motion has singled out intermittency in motor control, i.e. a basically discontinuous and point-by-point control scheme, as an essential factor in chunking. Classical motor control theories with claims of so-called closed loop continuous feedback have in recent years been challenged by models suggesting intermittent control manifest in so-called open loop and preprogrammed motor commands, because continuous feedback loops are thought to be too slow for many highly demanding tasks. Various findings in human motor control, i.e. the so-called psychological refractory period, principles of posture-based motion control, of action hierarchies, of goal-directed behaviour, and our own research on music-related body motion, seem to converge in suggesting the existence of intermittent motor control for chunking at the short-term timescale of very approximately 0.5 seconds duration. Following a so-called motor theory approach, the basic tenet here is that schemas of sound-producing body motion are projected onto whatever musical sound it is that we are hearing. This in turn means understanding chunking by recognizing a number of constraints of body motion and motor control, something that suggests an unequal distribution of attention and effort in musical experience, hence, the idea of chunking by intermittent motor control in music.
-
Godøy, Rolf Inge (2015). Discontinuity within continuity: intermittency in musical experience.
Show summary
Although we may experience music as continuous, as making us feel carried away in an unbroken flow of sound and body motion, we also know that music consists of a series of sonic and body motion events in succession. With events such as tone/sound onsets, or more composite events such as chunks with distinct rhythmical or melodic patterns, we could say that there is a duality of discontinuity and continuity in musical experience. This duality has intrigued phenomenological studies of temporal consciousness, leading Husserl (in dialogue with several of his contemporaries) to suggest that musical experience proceeds by a series of intermittent moments in time, by so-called 'now-points', each including a micro context of the immediate past, present and immediate future expectations. Various researchers in the 20th century have followed up with other ideas for reconciling the discontinuous and continuous, but in recent decades we have seen some significant advances in understanding the behavioral and neurocognitive bases for unit formation, by what we call intermittency in musical experience, closely linked with our present research on body motion in musical experience. Our basic tenet is that we have several concurrent timescales in musical experience, ranging from the very fast (e.g. single vibrations or impulses) to the slower (various singular sounds and/or concatenations of sounds) and very slow (phrases, sections or even whole works of music), but that there is a discontinuity at work both in the production and the perception of musical sound. We see motor control as basically intermittent (as opposed to the continuous control claimed by classical control theory) and proceeding by a series of anticipatory images of future motion chunks. 
This is linked with a general constraint on action and perception, the so-called psychological refractory period, suggesting that our organism is optimally attuned to controlling and perceiving chunks of action and sound in the approximately 0.5 seconds timescale. These (and some additional) factors converge in suggesting that there are basic intermittency constraints in musical experience, but that with the concatenation of discontinuous chunks in succession, we may also experience continuity at larger timescales in music.
-
Godøy, Rolf Inge (2015). Motormimetic Feature Mapping in Musical Experience.
Show summary
There can be no doubt that we in music often experience correspondences between different sense modalities such as between sound, vision, motion, and touch (just to mention the most prominent ones), evident in dance and other kinds of music-related body motion, and also reflected in listeners' innumerable accounts of visual associations with music and in the ubiquitous use of visual metaphors for musical sound. In short, it should not be controversial to claim that music is a multimodal form of art, involving a number of sensations beyond 'pure sound'. But how the different sense modalities are activated and interact in musical experience, still presents us with a number of unanswered questions. In our research, we have been pursuing the idea of what we call "motormimetic cognition", meaning that we see evidence of an incessant mental simulation of sound-related body motion in music perception, primarily of assumed sound-producing body motion (e.g. hitting, stroking, bowing, blowing), but also of various kinds of sound-accompanying body motion (e.g. dancing, walking, gesticulating). We regard such mental simulation of body motion as applicable to most (perhaps all) features of music, and we believe that motormimetic cognition is an amodal and universally applicable mental activity that can translate between modalities in musical experience. We also believe motormimetic cognition can be the basis for a systematic research effort on feature mapping between different modalities in music, in particular between sound and vision by way of studying body motion trajectory shapes and/or posture shapes in music-related contexts. In my presentation, I shall give an overview of the main ideas of motormimetic cognition and also demonstrate how it is relevant for work with new technologies and in multimedia art.
-
Godøy, Rolf Inge (2015). Sound-Motion Similarity in Musical Experience.
Show summary
Main similarity issues in our sound-motion research:
• Cross-modal similarity in music, in particular subjective perception of similarities between sound and body motion
• Approximate similarity in imitation: the ability to reproduce salient and readily recognizable features in oral and improvisational musical contexts
• Similarity in musical translations: recognizing musical ideas as similar across very different instrumental and/or vocal versions and arrangements
A suspected common factor in these (and several other similarity issues) is motor cognition, i.e. our disposition to perceive and think in terms of body motion.
-
Nuwer, Rachel; Kobb, Christina Sofie; Godøy, Rolf Inge & Jensenius, Alexander Refsum (2015, 21 July). Playing Mozart’s Piano Pieces as Mozart Did.
The New York Times.
Show summary
Classical piano pieces by such composers as Beethoven, Mozart and Chopin likely sounded much different when the masters first performed those works than they do today. Pianos themselves have changed considerably — but so, too, has technique.
-
Godøy, Rolf Inge (2014). Coarticulation in the production and perception of music.
Show summary
The term 'coarticulation' designates the fusion of small-scale events such as single sounds and single sound-producing actions into larger chunks of sound and body motion, resulting in qualitatively new features at the medium-scale level of the chunk. Coarticulation has been extensively studied in linguistics and to a certain extent in other domains of human body motion, but so far not so much in music, so the main aim of our lecture is to provide a background for how we can explore coarticulation in music. The contention is that coarticulation in music should be understood as based on a number of physical, biomechanical and cognitive constraints, and that it is an essential shaping factor for several perceptually salient features of music.
-
Godøy, Rolf Inge (2014). Motor constraints shaping musical experience.
Show summary
We have in recent decades seen a surge in publications on embodied music cognition, and it is now broadly accepted that musical experience is intimately linked with experiences of body motion. Going further into this, it is also clear that music performance is not something abstract and without restrictions, but something traditionally (i.e. before the advent of electronic music) also constrained by our possibilities for body motion. There are a number of biomechanical constraints reflected in musical sound, such as maximal speeds of human motion, need for rest, economy of effort, and avoiding strain injury, and there are also constraints of motor control, such as the need for grouping and planning ahead. These constraints often lead to a fusion or contextual smearing of sound-producing body motion, in turn also affecting the sound output, effectively contributing to shaping musical sound. One such prominent constraint-based phenomenon is so-called phase-transition, designating the fusion of otherwise singular actions into more superordinate actions with increasing speed of body motion, e.g. as happens when we accelerate the performance of any rhythmical pattern from slow to fast. Another constraint-based outcome is so-called coarticulation, meaning the fusion of otherwise distinct body motions into more superordinate body motion, entailing also a contextual smearing of musical sound. In our research on music-related body motion we see evidence of such body motion constraints on the shaping of musical sound. We can even claim that we expect such constraints to be reflected in segmentation, phase-transition, and coarticulation in music, hence, that we may speak of a mutual attunement of bodily constraints and perception in music. Such constraint-based phenomena in musical performance could then be seen as an alternative to more traditional notation-based paradigms in music research.
-
Godøy, Rolf Inge (2014). Postures and motion in musical experience.
Show summary
There are innumerable and strong links between sound and body motion in musical experience, as we may readily observe everywhere in listening and performance situations. In what may be broadly called a motor theory perspective, our perception of musical sound is so closely linked with our experiences of sound-producing body motion that music could be understood as a fusion of sound and body motion, i.e. as a composite, multimodal form of art. A fair amount of research in music psychology and other cognitive sciences from the last couple of decades seems to support such a motor theory perspective. In our own research we have looked at how people with different musical training make spontaneous body motions that reflect salient features of sound production, and we have also looked at the actual sound-producing body motion made by professional musicians in various performance situations. In making a summary of our own and others' findings, we are now developing a model of sound-motion feature correspondences based on the twin concepts of postures and motion in musical experience. Briefly stated, postures denote the shape and position of sound-producing effectors (fingers, hands, arms, torso, feet, vocal tract) at salient moments in the music, and motion denotes the continuous transition between these postures. The basic idea is that these postures are landmarks, or what we have called goal-points, in the continuous stream of sound and body motion in music, and that they are the basis for the formation of chunks (gestalts, sonic objects) in musical experience. The long-term aim of this work is to enhance our understanding of unit formation in music, or more generally, to understand the interplay of continuity and discontinuity in musical experience.
-
Godøy, Rolf Inge (2014). Quantal elements in music cognition.
-
Godøy, Rolf Inge (2014). Sound and body motion timescales in musical experience.
Show summary
Musical experience, be that in performance or listening, obviously unfolds in time; however, this may be forgotten when we focus on musical features such as style and historical context. When considering musical schemata, it could be useful to clarify the timescales. Granted that we in music have timescales extending from the very short of audible vibrations to the very long of whole works, we also have different schemata at different timescales. This has become particularly evident in our research on music and body motion, which leads us to suggest three main timescales at work in musical experience:
• The micro timescale of continuous sound and body motion with features such as pitch, stationary dynamics and timbre, as well as fast fluctuations of these features.
• The meso timescale, approximately in the 0.5 to 5 seconds range, of what we call chunks or sonic objects. This is the timescale of many salient musical sound features such as rhythm, texture, melodic fragments, modality, and expressivity, as well as most salient body motion features.
• The macro timescale of several meso timescale chunks in succession, such as in sections and whole works of music; this is the scale on which narrative or dramaturgical musical elements are found.
Although historically informed listening may variably involve all these timescales, there can be little doubt that the most important is the meso timescale, and this is also the timescale where music-related body motion elements are most clearly manifest. Clearer notions of timescales along these lines could be useful for discussions of schemata in musical experience, and should encourage us to be more critical of various inherited notions of form in Western musical thought.
-
Haugnes, Gunhild M.; Jensenius, Alexander Refsum; Tørresen, Jim & Godøy, Rolf Inge (2014, 11 February). Musikk + IT = kreativ boom. [Internet].
Institutt for informatikk.
Show summary
Tools that detect cerebral palsy in premature infants, a musician who became a ski-app founder, and the development of trousers with built-in drums.
-
Smaadal, Camilla; Jensenius, Alexander Refsum & Godøy, Rolf Inge (2014, 29 January). fourMs-prosjektet ved Universitetet i Oslo. [Internet].
UiO.no.
-
Godøy, Rolf Inge (2013). Coarticulation in the production and perception of music.
Show summary
In the past couple of decades, we have seen much research documenting close links between music and body motion. However, we need a better understanding of how meaningful units of sound and body motion are generated and perceived in music. The phenomenon of coarticulation, meaning the fusion of micro-level actions and sonic events into larger and somehow meaningful chunks of sound and motion, could help us not only to better understand sound and body motion links in music, but also to deepen our understanding of expressive and affective features of music. Coarticulation has been extensively studied in linguistics, to a certain extent in human movement science, but not so much in music. In my presentation, I shall give an overview of our own and others' research on coarticulation in music.
-
Godøy, Rolf Inge (2013). The Convergence of "Hard" and "Soft" in Music Technology.
Show summary
Music has had a long-lasting relationship with technology, extending from sophisticated mechanical instruments in earlier centuries to present-day digital means of music production and distribution. With one foot in technology and the other in musical aesthetics, the convergence of "hard" and "soft" is more evident than ever in current music technology research and development. Given the seemingly limitless possibilities of digital music technology to generate any sound, previously heard or unheard, one major challenge now is to develop better means for accessing subjective and affective features of music, in short, to make more musically meaningful man-machine interaction schemes.
-
Godøy, Rolf Inge (2013). Understanding Coarticulation in Music.
Show summary
The term 'coarticulation' designates the fusion of small-scale events such as single sounds and single sound-producing actions into larger chunks of sound and body motion, resulting in qualitatively new features at the medium-scale level of the chunk. Coarticulation has been extensively studied in linguistics and to a certain extent in other domains of human body motion, but so far not so much in music, so the main aim of this paper is to provide a background for how we can explore coarticulation in music. The contention is that coarticulation in music should be understood as based on a number of physical, biomechanical and cognitive constraints, and that it is an essential shaping factor for several perceptually salient features of music.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Grønli, Kristin Straumsheim (2013, 18 February). Musikkens elektroniske fremtid. [Internet].
Forskningsrådets Nyheter.
Show summary
Musicians will need new technological skills and will play digitally extended instruments. Listeners will go from being passive consumers to taking part in influencing and producing.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge; Johnson, Victoria Kristine Å & Grønli, Kristin Straumsheim (2013, 11 March). Fremtidens musikere må bygge instrumentet og lage låta underveis. [Internet].
Teknisk ukeblad.
-
Nymoen, Kristian; Tørresen, Jim; Godøy, Rolf Inge; Jensenius, Alexander Refsum & Høvin, Mats Erling (2013). Methods and Technologies for Analysing Links Between Musical Sound and Body Motion. Series of dissertations submitted to the Faculty of Mathematics and Natural Sciences, University of Oslo. 1291.
Show summary
There are strong indications that musical sound and body motion are related. For instance, musical sound is often the result of body motion in the form of sound-producing actions, and musical sound may lead to body motion such as dance. The research presented in this dissertation is focused on technologies and methods of studying lower-level features of motion, and how people relate motion to sound. Two experiments on so-called sound-tracing, meaning representation of perceptual sound features through body motion, have been carried out and analysed quantitatively. The motion of a number of participants has been recorded using state-of-the-art motion capture technologies. In order to determine the quality of the data that has been recorded, these technologies themselves are also a subject of research in this thesis. A toolbox for storing and streaming music-related data is presented. This toolbox allows synchronised recording of motion capture data from several systems, independently of system-specific characteristics like data types or sampling rates. The thesis presents evaluations of four motion tracking systems used in research on music-related body motion. They include the Xsens motion capture suit, optical infrared marker-based systems from NaturalPoint and Qualisys, as well as the inertial sensors of an iPod Touch. These systems cover a range of motion tracking technologies, from state-of-the-art to low-cost and ubiquitous mobile devices. Weaknesses and strengths of the various systems are pointed out, with a focus on applications for music performance and analysis of music-related motion. The process of extracting features from motion data is discussed in the thesis, along with motion features used in analysis of sound-tracing experiments, including time-varying features and global features. Features for realtime use are also discussed related to the development of a new motion-based musical instrument: The SoundSaber.
Finally, four papers on sound-tracing experiments present results and methods of analysing people’s bodily responses to short sound objects. These papers cover two experiments, presenting various analytical approaches. In the first experiment participants moved a rod in the air to mimic the sound qualities in the motion of the rod. In the second experiment the participants held two handles and a different selection of sound stimuli was used. In both experiments optical infrared marker-based motion capture technology was used to record the motion. The links between sound and motion were analysed using four approaches. (1) A pattern recognition classifier was trained to classify sound-tracings, and the performance of the classifier was analysed to search for similarity in motion patterns exhibited by participants. (2) Spearman’s ρ correlation was applied to analyse the correlation between individual sound and motion features. (3) Canonical correlation analysis was applied in order to analyse correlations between combinations of sound features and motion features in the sound-tracing experiments. (4) Traditional statistical tests were applied to compare sound-tracing strategies between a variety of sounds and participants differing in levels of musical training. Since the individual analysis methods provide different perspectives on the links between sound and motion, the use of several methods of analysis is recommended to obtain a broad understanding of how sound may evoke bodily responses.
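For readers less familiar with rank correlation, the Spearman's ρ step in approach (2) can be sketched in a few lines of NumPy. This is a minimal illustration with made-up feature series, not code from the dissertation; the names `sound_feature` and `motion_feature` are hypothetical.

```python
import numpy as np

def ranks(x):
    """Assign each value its rank in sorted order (ties ignored for brevity)."""
    order = np.argsort(x)
    r = np.empty(len(x))
    r[order] = np.arange(len(x))
    return r

def spearman_rho(x, y):
    """Spearman's rho: the Pearson correlation of the two rank series."""
    rx, ry = ranks(x), ranks(y)
    rx = (rx - rx.mean()) / rx.std()
    ry = (ry - ry.mean()) / ry.std()
    return float(np.mean(rx * ry))

# Hypothetical per-frame features, e.g. loudness vs. hand speed.
sound_feature = np.array([0.1, 0.5, 0.3, 0.9, 0.7])
motion_feature = sound_feature ** 3   # any monotonic relation gives rho = 1
```

Any strictly monotonic relation between the two series yields ρ = 1, which is why rank correlation is attractive for noisy, nonlinearly related sound and motion features.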
-
Godøy, Rolf Inge (2012). Continuity and discontinuity in music-related motion.
Show summary
The many and close links between sound and body motion in music seem now to be well documented, and it seems fair to claim that sensations of body motion are very often (or perhaps always) integral to musical experience. But in spite of enhanced methods for studying music-related body motion in the last couple of decades, we still have substantial challenges in understanding how such body motion is perceived and conceived by musicians and listeners alike. One main question here is how continuous streams of sound and body motion are segmented into somehow meaningful chunks, in other words, how continuity and discontinuity interact in our subjective experience. In our research, we have found it useful to distinguish between different timescales of sound and body motion, and furthermore, to focus on what we call the meso-level timescale with chunks of sound and body motion in the approximately 0.5 to 5 seconds duration range. At this timescale, we believe sensations of continuity and discontinuity coexist in holistically perceived chunks of sound and body motion, and that this coexistence is based on the convergence of various physical, biomechanical, neurocognitive and musical-aesthetical constraints. In my talk, I shall give a summary of past and present research on this topic, including practical applications of our ideas here to various music-related body motion data.
-
Godøy, Rolf Inge (2012). Postures, Trajectories, and Sonic Shapes.
Show summary
During the last decades, there has been a growing interest in the relationships between sound and body motion in music, resulting in several publications claiming that music is multimodal, i.e. that music in addition to sound also includes elements of body motion such as kinematics (visual images of motion trajectories), dynamics (sense of motion effort) and haptics (sense of touch): we hear the sound of a musical performance and at the same time see (or imagine) the body motions of the performers and mentally simulate the effort and sense of touch related to the performance. One common feature of these multimodal elements in music is the notion of shape: we see or imagine the shape of the body motion trajectories, of the fluctuating effort, and of the tactile experience of playing the instruments (or the motions of the vocal apparatus in the case of singing). Also, notions of shape are well established in the perceptual attributes of sound as so-called envelopes, both in the overall dynamic unfolding of sounds and in the stable, as well as in the evolving, or even transient spectral content of sounds. And needless to say, notions of shape are integral to our Western conceptual apparatus as reflected in common practice music notation (and its more recent extensions such as MIDI) for representing e.g. melodic, textural and intensity shapes. Given this background, the focus of my presentation will be on modelling shape in musical experience by sequences of key-postures of the effectors (fingers, hands, arms, torso, etc.) at salient moments in the musical performance (downbeats and other accents), with continuous and so-called coarticulated (fused) motion trajectories between these key-postures. Based on evidence from so-called motor theories of perception, sonic shapes can be linked with the shapes of such key-postures and trajectories, enhancing our understanding of music as multimodal embodied shapes.
-
Godøy, Rolf Inge (2012). Sonic Object Design.
-
Godøy, Rolf Inge (2012). Thinking Shapes in Musical Experience.
-
Godøy, Rolf Inge; Andresen, Kari; Jensenius, Alexander Refsum & Thomsson, Annica (2012, 25 July). Lyd for kroppen. [Internet].
uio.no.
Show summary
Music and motion belong together. So what happens in your head when you sit completely still? Research is now being done on the connection between the body and sounds.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum; Voldsund, Arve; Glette, Kyrre Harald; Høvin, Mats Erling; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Tørresen, Jim (2012). Classifying Music-Related Actions.
Show summary
Our research on music-related actions is based on the conviction that sensations of both sound and body motion are inseparable in the production and perception of music. The expression "music-related actions" is here used to refer to chunks of combined sound and body motion, typically in the duration range of approximately 0.5 to 5 seconds. We believe that chunk-level music-related actions are highly significant for the experience of music, and we are presently working on establishing a database of music-related actions in order to facilitate access to, and research on, our fast growing collection of motion capture data and related material. In this work, we are confronted with a number of perceptual, conceptual and technological issues regarding classification of music-related actions, issues that will be presented and discussed in this paper.
-
Godøy, Rolf Inge (2011). Coarticulation in Music-Related Gestures.
Show summary
In our research on music-related gestures (http://www.fourms.uio.no/), we have come to believe that the phenomenon of coarticulation plays an essential role in both the production and the perception of music. Coarticulation here means the fusion of singular actions and sound-events into more superordinate continuous movements and sound passages, e.g. the singular rapid finger movements and sounds of a piano performance fused into more superordinate hand/arm movements and continuous melodic contours. Coarticulation is well known in linguistics and in some human movement sciences, but relatively little studied in music. However, the few available studies of coarticulation in music as well as our own video and motion capture data seem to show coarticulation at work in relation to singular sound-events, and analyses of the sound similarly show the contextual smearing of sound events that is the hallmark of coarticulation. With a recognition of coarticulation at work in the production and perception of music, we believe we can better understand how various contextual effects emerge in music, i.e. that various phenomena such as rhythmic, textural, and melodic patterns can be understood as shaped by coarticulation.
-
Godøy, Rolf Inge (2011). Images of sound, postures and trajectories in music. Keynote lecture, the Embodiment-Experiment Seminar, Department of Music, University of York, May 10th-11th, 2011.
-
Godøy, Rolf Inge (2011). Sonic feature timescales and music-related actions.
-
Godøy, Rolf Inge (2011). Sound-Action Timescales. Lecture at the International Summer School in Systematic Musicology, Jyväskylä, Finland, 08/08/11-18/08/11.
Show summary
In our ongoing research, we seek to correlate different sonic feature timescales with sensations of body movements, ranging from fast (e.g. trembling, shaking, etc.), to slower (e.g. whole arm movement), to slow (e.g. torso, whole body movement), and also to quasi-stationary body postures. In many cases, there are clear causal relationships between sound-producing actions of musicians and emergent sonic features (e.g. tremolo movements of the hand and tremolo sounds), causal relationships that seem to be readily perceived by listeners. But images of embodied energy patterns in sound can also be extended into generic categories applicable to sounds regardless of origin, providing a conceptual apparatus for categorizing sonic features in music theory, music analysis and music information retrieval.
-
Godøy, Rolf Inge & Bjørkeng, Per Kristian (2011, 24 January). Kropp og sinn i ett og alt.
Aftenposten.
Show summary
Interview with Rolf Inge Godøy and colleagues about body-related experience of music.
-
Kozak, Mariusz; Nymoen, Kristian & Godøy, Rolf Inge (2011). The Effects of Spectral Features of Sound on Gesture Type and Timing.
-
Glette, Kyrre Harald; Jensenius, Alexander Refsum & Godøy, Rolf Inge (2010). Extracting action-sound features from a sound-tracing study.
Show summary
The paper addresses possibilities of extracting information from music-related actions, in the particular case of what we call sound-tracings. These tracings are recordings from a graphics tablet of subjects' drawings associated with a set of short sounds. Although the subjects' associations to sounds are very subjective, and thus the resulting tracings are very different, an attempt is made at extracting some global features which can be used for comparison between tracings. These features are then analyzed and classified with an SVM classifier.
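The "global features" step described above can be illustrated with a small sketch: given a tracing recorded as a sequence of tablet positions, compute a few scalar descriptors that could then be fed to a classifier such as the SVM mentioned in the abstract. This is a hypothetical reconstruction, not the feature set actually used in the paper.

```python
import numpy as np

def global_features(tracing):
    """A few global descriptors of a 2-D tracing.
    tracing: array of shape (n_samples, 2) with tablet x/y positions."""
    steps = np.linalg.norm(np.diff(tracing, axis=0), axis=1)  # per-sample moves
    extent = tracing.max(axis=0) - tracing.min(axis=0)        # bounding box
    return {
        "path_length": float(steps.sum()),   # total distance drawn
        "mean_step": float(steps.mean()),    # average drawing speed proxy
        "bbox_area": float(extent.prod()),   # area covered by the tracing
    }

# A straight-line tracing from (0, 0) to (3, 4) in five equal steps.
line = np.linspace([0.0, 0.0], [3.0, 4.0], 6)
```

Vectors of such descriptors, one per tracing, are the kind of fixed-length input a classifier needs, even though the tracings themselves differ in duration and shape.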
-
Godøy, Rolf Inge (2010). Musical Gestures: Sound, Movement, and Meaning - A book presentation.
-
Godøy, Rolf Inge (2010). Music-related Actions.
Show summary
The close links between sound and movement are ubiquitous in musical performance, listening, or innumerable everyday situations. Body movements (real or imagined) seem so integral to musical experience that it is hard to think of music without also thinking of body movement. Increasing interest in studying music-related body movement has given us improved methods and technologies for our research, yet one of the most intriguing issues is how we conceptualize and represent sound and movement in our minds as meaningful actions: how can we have more or less solid images of sound and movement as these in their very nature are transient and ephemeral? The question is both conceptual and pragmatic as it directly concerns how we capture, process, and represent sound and movement data. Our strategy is to focus on fragments of sound and movement, on what we call music-related actions at the chunk-level, and in my presentation I shall give an overview of the main elements of our ongoing research here.
-
Godøy, Rolf Inge (2010). Sound Shapes.
Show summary
We have seen important advances in musical acoustics, psychoacoustics, and more recently in embodied music cognition, but we still seem to lack a good conceptual apparatus for speaking about subjectively experienced sonic features in music. Inspired by the seminal work of Pierre Schaeffer and coworkers half a century ago, one of our long-term goals is trying to bridge the gap between subjectively experienced sonic features expressed in various tactile and/or kinematic metaphors such as rough, smooth, narrow, open, thick, thin, etc., and corresponding sound signal features. Such tactile-kinematic metaphors can be collectively called 'sound shapes', and could also be useful for musical aesthetics as a conceptual apparatus for speaking about subjectively experienced sonic features.
-
Jensenius, Alexander Refsum; Glette, Kyrre Harald; Godøy, Rolf Inge; Høvin, Mats Erling; Nymoen, Kristian; Skogstad, Ståle Andreas van Dorp & Tørresen, Jim (2010). fourMs, University of Oslo – Lab Report.
-
Jensenius, Alexander Refsum & Godøy, Rolf Inge (2010). Input technologies for music-related actions.
-
Godøy, Rolf Inge (2009). Anticipatory chunking of music-related actions.
Show summary
One major element in music-related body movement is the emergence of meaningful units of sound and movement, what we call chunking. Chunking is often explained by various gestalt-like principles such as closure or belonging, or by qualitative discontinuities such as shifts between sound and silence, or by detecting repetitions of rhythmic, melodic, timbral, etc. patterns, hence essentially by looking at the signal (be that in sound and/or in movement). As a supplement to these signal-based cues for chunking, we now turn to the role of anticipatory cognition, meaning to the preparatory elements in movement and the control of movement. Anticipatory elements are clearly observable in the phenomenon of coarticulation, meaning the subsumption and contextual smearing of otherwise separate actions and sounds into more superordinate units, so that the shape and position of the effectors (lips, vocal tract, fingers, hands, etc.) at any moment are determined by what to do next (as well as by what was just done). There is now converging evidence from various behavioral research for the existence of anticipatory chunking, suggesting that we may conceive of a chunk of music-related movement by an "instantaneous" overview image, "in-a-now", as was suggested by phenomenological philosophers more than 100 years ago. The challenge now is to substantiate these ideas of anticipatory chunking in our research on music-related movement.
-
Godøy, Rolf Inge (2009). Chunking sound-actions in musical experience.
Show summary
One of the major challenges in studying sound-action relationships in music is that of how somehow meaningful units of sound and action emerge from the continuous stream of sensations, a process we like to call chunking. It seems that chunking is related to a number of constraints for body movement and for sound perception, as well as related to various musical-aesthetical elements. In this lecture, I will give an overview of some current theories of chunking as well as demonstrate how we are trying to study chunking in our ongoing research.
-
Godøy, Rolf Inge (2009). Sound, Movement, Key-Frames and Inter-Frames.
Show summary
Close links between sound and movement are ubiquitous in musical performance, listening, or innumerable everyday situations. Body movements (real or imagined) seem so integral to musical experience that it is hard to think of music without also thinking of body movement. Increasing interest in studying music-related body movement has given us improved methods and technologies for our research, yet one of the most intriguing issues is how we conceptualise and represent sound and movement in our minds and in our research: How can we have more or less solid images of sound and movement as they in their very nature are transient and ephemeral? The question is both conceptual and pragmatic as it directly concerns how we capture, process, and represent sound and movement data. After years of theoretical reflection alternating with practical work, our solution is to regard music-related movement as focused around key-frames, meaning salient postures in time, interleaved with inter-frames, meaning continuous movement between the key-frames. Borrowed from film animation and now applied in human movement science, we believe key-frames and inter-frames correspond to similar elements in the sound, giving us a coherent framework for studying music-related movement.
-
Godøy, Rolf Inge & Jensenius, Alexander Refsum (2009). Body Movement in Music Information Retrieval.
-
Godøy, Rolf Inge & Jensenius, Alexander Refsum (2009). Typomorphological features of sonic objects.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Nymoen, Kristian (2009). Chunking by coarticulation in music-related gestures.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Nymoen, Kristian (2009). Coarticulation of sound and movement in music.
-
Jensenius, Alexander Refsum; Skogstad, Ståle Andreas van Dorp; Nymoen, Kristian; Godøy, Rolf Inge; Tørresen, Jim & Høvin, Mats Erling (2009). Reduced displays of multidimensional motion capture data sets of musical performance.
Show summary
Background: Carrying out research in the field of music and movement involves working with different types of data (e.g. motion capture and sensor data) and media (i.e. audio, video), each having its own size, dimensions, speed etc. While each of the data types and media have their own analytical tools and representation techniques, we see the need for developing more tools that allow for studying all the data and media together in a synchronised manner. We have previously developed solutions for studying musical sound and movement in parallel by using synchronised spectrograms of audio and motiongrams of video. Now as we have started using an infrared motion capture system in our research, we see the need for better visualisation techniques of the highly multidimensional data sets being recorded (e.g. 50 markers x 3 dimensions x 100 Hz). While there are several techniques for doing this independently of audio and video, we are working on tools that integrate well with our displays of spectrograms and motiongrams.
Aims: Creating reduced representations of multidimensional motion capture data of complex music-related body movement that can be used together with spectrograms and motiongrams.
Results/Main Contribution: We present some of the visualisation techniques we have been developing to display multidimensional data sets: 1) reduction based on collapsing dimensions, 2) reduction based on frame differencing, 3) colour coding of movement features. We show how these techniques allow for displaying reduced representations of multidimensional motion capture data sets synchronised with spectrograms and motiongrams.
Conclusions/Implications: The techniques presented allow for studying relationships between movement and sound in music performance, and make it possible to create visual displays of movement and sound that can be used on screen and in printed documents.
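The frame-differencing reduction mentioned in the abstract is easy to sketch: collapse a (frames × markers × dimensions) motion-capture array into a single per-frame motion curve. A minimal NumPy illustration, not the authors' implementation:

```python
import numpy as np

def quantity_of_motion(mocap):
    """Reduce a (frames, markers, 3) motion-capture array to one value per
    frame: absolute frame-to-frame change summed over markers and dimensions."""
    return np.abs(np.diff(mocap, axis=0)).sum(axis=(1, 2))

# Synthetic recording: 10 frames, 5 markers; only marker 0 moves along x,
# by one unit per frame.
mocap = np.zeros((10, 5, 3))
mocap[:, 0, 0] = np.arange(10)
```

A curve like this can then be plotted on a shared timeline alongside spectrograms and motiongrams.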
-
Godøy, Rolf Inge (2008). Chunking Sound for Musical Analysis.
Show summary
One intriguing issue in music analysis is that of segmentation, or parsing, of continuous auditory streams into some kinds of meaningful and analytically convenient units, a process I here prefer to denote as chunking. The purpose of this paper is to present a theory of chunking in musical analysis based on recent ideas of embodied auditory cognition and our own research on musical gestures (http://musicalgestures.uio.no). Although the topic of chunking in sound has been discussed from the time of early music-related gestalt theory and phenomenology at the end of the nineteenth century up to present theories of auditory perception, there can be no doubt that the most consistent focus on chunking of musical sound may be found in the theoretical works of Pierre Schaeffer with his ideas of the fragment, of the sonic object as the most significant phenomenon in music (Schaeffer 1966). Typically, the sonic object in Schaeffer's theory is in the range of a few seconds, what I here call a meso-size chunk. Interestingly, recent neurocognitive research seems to agree with the idea of attention spans of approximately three seconds (Pöppel 1997, Varela 1999). And as to the human action side, there seems to be a similar convergence of human actions to such meso-size chunks in the approximately three-second range (Schleidt and Kien 1997). Our own observation studies of sound-related gestures also seem to converge on this size as a 'normal' size chunk of music-related movement (Godøy 2006a, Godøy, Haga, and Jensenius 2006a and 2006b). In this paper, I shall present converging evidence in support of the primordial role of such meso-size chunks in music perception and cognition, and argue that such meso-size chunks also should be the basis for musical analysis, as well as present various musical examples to illustrate this.
-
Godøy, Rolf Inge (2008). Goal-points and trajectories in music-related movement.
Show summary
In our research on music-related movement (http://musicalgestures.uio.no), we have seen that listeners with very different levels of musical training all seem to be able to imitate sound-producing gestures suggested by the music, evident in various kinds of 'air instrument' performance such as air guitar, air drums, air piano, etc. We understand this in the framework of 'embodied cognition', meaning that perception is closely linked with incessant mental simulations of body movements (Gallese and Metzinger 2003). This means that we make sense of what we see, hear, feel, etc., by mentally simulating (and sometimes also overtly carrying out) various body movements, both our own and those of others, associated with whatever we perceive and think. Also, the various 'air performance' gestures and other sound-related gestures we have studied seem centered on certain salient points in the music such as various accents (downbeats or other kinds of accents) or melodic or textural peaks. We understand this rendering of salient points as goal-directed behavior, meaning robust perception and rendering of the goals of movements and more variability or 'inaccuracies' in the movement trajectories between these goals (Wohlschläger et al. 2003). We use the expression 'goal-points' to denote this phenomenon, meaning the shape or posture and the positions of the effectors (e.g. shape and position of the hands on the keyboard, angle and position of hands and arms in relation to the drums, etc.) at certain points in time. Between these goal-points, we have more or less continuous movement trajectories; however, these trajectories are subordinate to the goal-points. We thus see music-related movements (both sound-producing and sound-accompanying movements) as organized around such a succession of goal-points, and this may have significant consequences not only for how we interpret music-related movement, but also for how we segment or chunk musical sound in general.
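One way to operationalize goal-points in motion-capture data is to look for near-still moments of the effector, since goal postures are rendered more robustly than the trajectories between them. The following is a speculative sketch under that assumption; the function name and the lowest-decile threshold are ours, not from the project:

```python
import numpy as np

def goal_point_frames(positions, fps, quantile=0.1):
    """Flag frames whose effector speed falls in the lowest decile,
    i.e. near-still postures that could serve as goal-point candidates.
    positions: (frames, 3) effector coordinates; fps: capture rate in Hz."""
    speed = np.linalg.norm(np.diff(positions, axis=0), axis=1) * fps
    return np.where(speed <= np.quantile(speed, quantile))[0]

# Synthetic effector path: move, hold still (a 'posture'), then move again.
x = np.array([0, 1, 2, 3, 4, 4, 4, 4, 5, 6], dtype=float)
positions = np.stack([x, np.zeros(10), np.zeros(10)], axis=1)
```

On this toy path the held posture shows up as a run of consecutive low-speed frames, which is the kind of landmark the goal-point idea predicts.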
-
Godøy, Rolf Inge (2008). Sound Actions: Human movement in the perception and cognition of music.
Show summary
We can see people moving to music everywhere: in dancing, in marching, in all kinds of everyday private or not so private listening situations like walking down the street making movements to the music of an iPod, or at concerts (provided it is socially acceptable for listeners to move), and of course in the performance of music. Listeners, regardless of training or level of expertise, seem to be able to spontaneously make movements that more or less reflect various salient features of the music. This makes us believe that human movement is an integral part of not only music perception, but also of music cognition in general, in that we may remember and imagine music as movements and not only as "pure sound". In this lecture, various research findings on the intimate links between human movement and music will be reviewed, and the consequences these findings could (or should) have for other areas of music research will be discussed. In particular, the issue of segmentation of music-related movements into somehow meaningful action chunks will be focused on, suggesting that various biomechanical and motor control elements may be influential in how we perceive and/or imagine musical sound.
-
Godøy, Rolf Inge & Jensenius, Alexander Refsum (2008, 26 September). Norges første musikk og bevegelseslab. [TV].
NRK 1 Kulturnytt.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Landsverk, Johanne (2008, 3 January). Musikk = kroppsrørsle.
Forskerforum.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Nymoen, Kristian (2008). Production and perception of goal-points and coarticulations in music.
Show summary
From our studies of sound-related movement (http://musicalgestures.uio.no), we have reason to believe that both sound-producing and sound-accompanying movements are centered around what we call goal-points, meaning certain salient events in the music such as downbeats, or various accent types, or melodic peaks. In music performance, these goal-points are reflected in the positions and shapes of the performers' effectors (fingers, hands, arms, torso, etc.) at certain moments in time, similar to what is known as keyframes in animation. The movement trajectories between these goal-points, similar to what is known as interframes in animation, may often demonstrate the phenomenon of coarticulation, i.e. that the various smaller movements are subsumed under more superordinate and goal-directed movement trajectories. In this paper, we shall present a summary of recent human movement research in support of this scheme of goal-points and coarticulations, as well as demonstrate this scheme with data from our ongoing motion capture studies of pianists' performance and other researchers' motion capture data. ©2008 Acoustical Society of America
-
Godøy, Rolf Inge & Nymoen, Kristian (2008, 31 October). Rørsle blir musikk.
Uniforum.
-
Jensenius, Alexander Refsum; Nymoen, Kristian & Godøy, Rolf Inge (2008). A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians.
Show summary
The paper presents some challenges faced in developing an experimental setup for studying coarticulation in music-related body movements. This has included solutions for storing and synchronising motion capture, biosensor and MIDI data, and related audio and video files. The implementation is based on a multilayered Gesture Description Interchange Format (GDIF) structure, written to Sound Description Interchange Format (SDIF) files using the graphical programming environment Max/MSP.
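The core synchronisation problem described above, streams recorded at different sampling rates, reduces to resampling each stream onto a shared timeline. A minimal sketch of that idea (the GDIF/SDIF file layer itself is not shown, and the variable names are illustrative):

```python
import numpy as np

def align_to_timeline(ref_times, stream_times, stream_values):
    """Linearly interpolate a sensor stream onto a reference timeline,
    so that motion-capture, biosensor and MIDI-derived data recorded at
    different rates can be compared sample by sample."""
    return np.interp(ref_times, stream_times, stream_values)

# A 1 Hz biosensor stream aligned to a 2 Hz motion-capture timeline.
sensor_t = np.array([0.0, 1.0, 2.0])
sensor_v = np.array([0.0, 10.0, 20.0])
mocap_t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
```

Linear interpolation is only one possible policy; for event-like data such as MIDI note-ons, nearest-event or hold-last-value alignment would be more appropriate.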
-
Lillebo, Maria Røbech & Godøy, Rolf Inge (2008, 26 October). Sanger som fester seg på hjernen. [Internet].
P4.
-
Godøy, Rolf Inge (2007). Chunking sound in listening and analysis.
Show summary
One intriguing issue in the analysis of electroacoustic music (and other kinds of music as well) is the segmentation or parsing of continuous auditory streams into meaningful and analytically convenient units, a process I here denote as chunking. This paper presents elements of a theory of chunking and proposes a three-layer model that can accommodate musical features at three different time scales: i) the micro-level (or sub-chunk level), focused on the content of the chunk, what Schaeffer called its contexture, including features such as grain and motion; ii) the meso-level (or chunk level), focused on the overall shape features of the chunk, corresponding to Schaeffer's typological categories; iii) the macro-level (or supra-chunk level), consisting of the cumulative memory of several successive chunks, as in longer passages of music. This three-level model is reasonably well founded and could be convenient for analytical purposes, something that will be illustrated with sound examples during the presentation.
-
Godøy, Rolf Inge (2007). Geometry and effort in gestural renderings of musical sound.
Show summary
In our current research on music-related gestures (http://musicalgestures.uio.no), we have had a particular focus on the spontaneous gestures that listeners make to musical sound. This has been motivated by the belief that perception and cognition of musical sound is intimately linked with mental images of movement, and that a process of incessant motor imagery runs in parallel with listening to, or even just imagining, musical sound. We have called this motormimetic cognition, and see evidence for it in a number of research findings as well as in our own observation studies. Furthermore, we believe hand movements have a privileged role in motormimetic cognition of musical sound, and that these hand movements may trace the geometry of musical sound (i.e. elements such as pitch contours, pitch spread, rhythmical patterns, textures, and even timbral elements as shapes) as well as convey sensations of its effort, hence the focus in this paper on geometry and effort in the gestural renderings of musical sound. There are many different gestures that may be associated with music. Using the Gibsonian concept of affordance, we can thus speak of rich gestural affordances of musical sound. For practical purposes we can in this paper think of two main categories, sound-producing gestures (such as hitting, stroking, bowing) and sound-accompanying gestures (such as dancing, marching, making various movements to the music), as well as several sub-categories of these. The distinction between these two main categories as well as their sub-categories may not always be so clear (e.g. musicians make gestures in performance that are probably not strictly necessary for producing sound, but may be useful for reasons of motor control or physiological comfort, or may have communicative functions towards other musicians or the audience).
But in order to carry out more systematic observation studies of gestural renderings, we have proceeded from giving subjects rather well-defined tasks with limited gestural affordances to progressively more open tasks with quite rich gestural affordances: from studies of air-instrument performances where subjects were asked to make sound-producing movements, to what we have called sound-tracing studies where the musical excerpts were quite restricted in their number of salient features, on to what we called free dance gestures with more complex, multi-feature excerpts and rather general instructions to subjects about making spontaneous gestural renderings based on what they perceived as the most salient features. The idea of gestural rendering of musical sound is based on a large body of research ranging from the classical motor theory of perception to more recent theories of motor involvement in perception in general, in audio perception more specifically, and in music-related tasks in particular. Obviously, auditory-motor couplings as well as the capacity to render and/or imitate sound are not restricted to hand movements, as is evident from vocal imitation of both non-musical and musical sound (e.g. so-called beat-boxing in hip-hop and other music, and scat singing in jazz). But the focus on hand movements in our case is based not only on innumerable informal observations of listeners making hand movements to musical sound, but also on the belief that hand movements have a privileged role from an evolutionary point of view and from a general gesture-cognitive point of view. Furthermore, we believe that a listener, through a process of translation by the principle of motor equivalence, may switch from one set of effectors to another, revealing more amodal gestural images of musical sound.
-
Godøy, Rolf Inge (2007). Gestural-Sonorous Objects.
-
Godøy, Rolf Inge (2007). Gesture research in music composition context.
-
Godøy, Rolf Inge (2007). Temporal phenomena and enigmas in the perception and cognition of musical sound.
-
Godøy, Rolf Inge (2006). Coarticulated gestural-sonorous objects in music.
-
Godøy, Rolf Inge (2006). Gestural-Sonorous Awareness in Musical Imagery.
Show summary
One intriguing issue in research on musical imagery (e.g. various contributions in Godøy and Jørgensen 2001) has been the relationship between sonorous and gestural images in our consciousness. Informal accounts by musicians and neurocognitive studies (e.g. Zatorre and Halpern 2005 and various references given there) seem clearly to support the idea of close links between auditory and motor elements in musical imagery. Furthermore, earlier conceptual work on embodied cognition (e.g. Johnson 1987) and related neurocognitive work (e.g. Berthoz 1997) now seem to fuse into a coherent understanding of bodily movement as a general basis for cognition and consciousness (e.g. Gallese and Lakoff 2005). Lastly, our own current studies of musical gestures (http://musicalgestures.uio.no) seem to indicate that listeners, even non-experts (novices), have spontaneous and fairly robust images of sound-producing gestures (Godøy, Haga, and Jensenius 2006), leading us to the idea of gestural-sonorous awareness in the perception and imagery of music. This paper will briefly review the abovementioned research and propose a model for understanding how gestural images are integral to our awareness of musical sound, link this with the phenomenological theory of internal temporal consciousness in music (Husserl 1893), and finally suggest some practical applications of volitional, "gesture-guided" musical imagery. Keywords: consciousness, awareness, intentionality, gesture, sound, musical imagery, motor imagery.
-
Godøy, Rolf Inge; Haga, Egil & Jensenius, Alexander Refsum (2006). Exploring Music-Related Gestures by Sound-Tracing - A Preliminary Study.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge & Kvifte, Tellef (2006). Towards a gesture description interchange format.
Show summary
This paper presents our need for a Gesture Description Interchange Format (GDIF) for storing, retrieving and sharing information about music-related gestures. Ideally, it should be possible to store all sorts of data from various commercial and custom made controllers, motion capture and computer vision systems, as well as results from different types of gesture analysis, in a coherent and consistent way. This would make it possible to use the information with different software, platforms and devices, and also allow for sharing data between research institutions. We present some of the data types that should be included, and discuss issues which need to be resolved.
-
Jensenius, Alexander Refsum; Kvifte, Tellef & Godøy, Rolf Inge (2006). Towards a Gesture Description Interchange Format [Poster].
-
Godøy, Rolf Inge (2005). Embodied Phenomenological Music Theory.
-
Godøy, Rolf Inge (2005). Gestural Sonorous Objects: Re-thinking Schaeffer's Typo-morphological concepts.
Show summary
In this paper, I will try to show how Pierre Schaeffer's focus on fragments of musical sound, on what he called sonorous objects (Schaeffer 1966), can be re-interpreted as intimately linked with mental images of action fragments, with what I here call gestural objects. To demonstrate these gestural-sonorous object links, I will briefly present some relevant concepts from Schaeffer's work, some ideas from recent work on embodied cognition, and conclude with some Schaeffer-inspired elements in our on-going research on gesture-based explorations of musical sound and the relevance of this for the analysis of electro-acoustic music. Essentially, this means rethinking Schaeffer's concepts of sonorous objects as gesture-related concepts, or as procedural knowledge, i.e. as active knowledge of movement.
-
Godøy, Rolf Inge; Haga, Egil & Jensenius, Alexander Refsum (2005). Playing "Air Instruments": Mimicry of Sound-producing Gestures by Novices and Experts.
-
Godøy, Rolf Inge & Ng, Kia (2005). COST287-ConGAS Project Presentation.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge & Spilde, Ingrid (2005, 1 October). Mellom teknologi og musikk. [Internet].
Forskning.no.
Show summary
Music technology is a fairly young field. On 30 September and 1 October, practitioners from across the country gather at Musikkteknologidagene 2005 to discuss the field's form and future.
-
Jensenius, Alexander Refsum; Gupta, Ram Eivind; Godøy, Rolf Inge; Haga, Egil; Aksnes, Hallgjerd & Kristoffersen, Kristian Emil (2005). Kroppslyd.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge & Wanderley, Marcelo M. (2005). Developing Tools for Studying Musical Gestures Within the Max/MSP/Jitter Environment.
Show summary
We present the Musical Gestures Toolbox, a collection of Max/MSP/Jitter modules to help in qualitative and quantitative analysis of musical gestures. Examples are shown of how the toolbox is used for studying musical mimicry, such as "air piano" performance, and expressive gestures of musicians.
-
Godøy, Rolf Inge (2004). Motor-Mimetic Cognition: Mediating Naturalistic and Culturalistic Approaches in Musicology.
-
Godøy, Rolf Inge (2004). Musical Gestures Research.
-
Godøy, Rolf Inge; Haga, Egil & Jensenius, Alexander Refsum (2004). Motormimetic sketching and the novice-expert continuum.
-
Jensenius, Alexander Refsum; Godøy, Rolf Inge; Haga, Egil & Aksnes, Hallgjerd (2004). Musical Gestures.
-
Godøy, Rolf Inge; Jensenius, Alexander Refsum & Borchgrevink, Hild (2004, 29 April). Seminar innleder prosjekt om musikk og gestikk. [Internet].
MIC.
Show summary
In connection with the launch of a research project on musical gestures, the Department of Music and Theatre at the University of Oslo invites all interested to a seminar on this topic on 14 and 15 May. The project, which involves both Norwegian and international research groups, will investigate the relationships between music, human movement, and musical concepts. The project is led by Rolf Inge Godøy at IMT. The seminar is open to everyone.
-
Paulsen, Cathrine TH & Godøy, Rolf Inge (2004, 12 July). Luftgitaren viktigere enn du tror. [Internet].
Forskning.no.
-
Godøy, Rolf Inge (2003). Gestural Imagery in the Service of Musical Imagery.
-
Godøy, Rolf Inge (2003). Motor-Mimetic Music Cognition. Leonardo: Journal of the International Society for the Arts, Sciences and Technology.
ISSN 0024-094X.
36(4), pp. 317–319
-
Godøy, Rolf Inge (2003). The Musical Gestures Project 2004-2007.
-
Godøy, Rolf Inge (2002). L'imagerie musicale.
-
Godøy, Rolf Inge (2001). Simple yet complex: Action-images in musical thought.
Show summary
The ecological bases for musical practice are a promising field for future musicology. Abandoning traditional symbol-based approaches, ecological approaches take as their point of departure the continuous audio-acoustic flux, as well as the human capacity for extracting focused and relatively stable sound-objects from this flux. Images of sound-producing actions play an important role in this emergence of focused sound-objects from the continuous flux, hence the idea of music cognition by action-images in musical thought. There are of course many unresolved questions and challenges in such an approach; however, action-images hold promise for elucidating some enigmatic elements of music cognition such as parsing, chunking, and temporal coding.
-
Godøy, Rolf Inge (2001). Temporal re-coding by action-images.
Show summary
The basic idea in this paper is that of an ecologically conditioned re-coding of musical sound into action-images when we listen to music, and hence, that temporal coding in music perception and cognition could be understood as a matter of forming motor images.
-
Godøy, Rolf Inge (2000). Cognition musicale par mimetisme moteur.
Show summary
The capacity to perceive and imagine musical sounds through images of their production, that is, to mentally simulate the actions one supposes lie behind what one hears.
-
Godøy, Rolf Inge (1999). Imagined Action, Excitation and Resonance.
Show summary
The aim of this paper is to show how images of sound-producing actions can enhance our capacity for imagining sonorous qualities. It presents a conceptual model of imagined sound production which separates excitation and resonance, classifies different modes of excitation and resonance, and relates all this to the notions of motor programs, motor equivalence and coarticulation in motor imagery. Various research supporting this model is presented together with some ideas for further research, the conclusion being that most sounds can be perceived as included in action-trajectories, and that in the case of musical imagery, this exploration of the "silent" choreography of sound-producing actions opens up differentiations and hence also an enhancement of our means for imagining sonorous qualities.
-
Godøy, Rolf Inge (1999). Lyden og kroppen. Parergon.
ISSN 0313-6221.
Show summary
A brief statement on the future of composition as a discipline and of sound research.
-
Godøy, Rolf Inge (1999). Shapes and Spaces in Musical Thinking.
Show summary
This is the first draft of a book on musical analysis based on the idea of knowledge in music theory as shapes of various trajectories in timespace. Music perception and cognition is here understood as fundamentally cross-modal, in particular involving action and vision in addition to audition. A triangular model of action, sound and vision is presented, and various issues of representation are discussed. An analysis-by-synthesis approach to musical sound is regarded as best suited to give us knowledge of previously underexplored elements such as timbre, texture and contour, and various issues of the relationship between the complex subsymbolic substrates of musical sound and more singular, focused notions of musical objects are discussed.
-
Schneider, Albrecht & Godøy, Rolf Inge (1999). Perspectives and Challenges.
Show summary
Although we have tried to define musical imagery as "our mental capacity for imagining musical sound in the absence of a directly audible sound source", this may in turn imply a number of different approaches and paradigms. Actually, this pluralism of approaches is fortunate, as we believe that the shifts of perspective caused by the various paradigms are fruitful and constructive. In particular, the phenomenological and gestaltist schools of thought of the late nineteenth and early twentieth centuries still seem relevant to our work today. We shall also try to see the topic of musical imagery in relation to other topics in music cognition, as the boundaries are in many cases rather unclear (e.g. in relation to auditory memory, various components of audition, music and bodily movement, etc.), ending up with a summary of what we see as some of the key questions in the field of musical imagery at the present stage.
Published May 19, 2006 12:00 AM
- Last modified Jan. 10, 2019 9:36 AM