Interactive Music Systems

Computer systems that support and extend musical listening, performance, and collaboration.

A collection of musical interactions on phones, tablets, and novel electronic devices.

Interactive music systems support sonic creativity and invite exploration from a broad audience.

Interactive Music Systems are computer systems for extending and enhancing the possibilities of musical listening, performance, and collaboration. Throughout the 20th century, electronic systems were used to create new kinds of musical instruments and sonic devices that brought musical appreciation to a very wide audience and greatly expanded the musical creativity available to novices. More recently, computer systems have been used to design very flexible musical experiences, such as NIMEs (New Interfaces for Musical Expression), where various sensors, processors, and outputs can be configured to serve a multitude of musical purposes for listeners, performers, and those somewhere in between. In the last fifteen years, such interfaces have gone from curious experiments to popular apps and commercial hardware products.
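To make the configurable sensor-to-sound idea behind NIMEs concrete, here is a minimal sketch of the sensor → mapping → output pipeline that many such interfaces follow. All names, the pentatonic mapping, and the print-based output stage are illustrative assumptions, not drawn from any particular instrument.

```python
# A minimal sketch of the sensor -> mapping -> output pipeline behind many
# NIMEs. All names here are illustrative, not from any particular system.

from dataclasses import dataclass
from typing import Callable

@dataclass
class SensorReading:
    """One raw reading from an input device (e.g., accelerometer, touch)."""
    name: str
    value: float  # normalised to 0.0-1.0

@dataclass
class SoundParameters:
    """Synthesis parameters the output stage understands."""
    pitch: int       # MIDI note number
    loudness: float  # 0.0-1.0

# A "mapping" turns sensor data into sound parameters; swapping mappings
# reconfigures the same hardware for a different musical purpose.
Mapping = Callable[[SensorReading], SoundParameters]

def tilt_to_melody(reading: SensorReading) -> SoundParameters:
    """Map a tilt sensor onto a pentatonic scale."""
    scale = [60, 62, 65, 67, 69]  # C, D, F, G, A
    index = min(int(reading.value * len(scale)), len(scale) - 1)
    return SoundParameters(pitch=scale[index], loudness=reading.value)

def play(params: SoundParameters) -> None:
    """Stand-in for a real synthesiser backend."""
    print(f"note {params.pitch} at loudness {params.loudness:.2f}")

# Wiring the stages together:
play(tilt_to_melody(SensorReading(name="tilt", value=0.7)))
```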

Recent advances in computing suggest further possibilities for playing, hearing, and enjoying music. New interactive music systems could help a wider audience to create music and encourage musical collaborations between friends or strangers. Machine learning and artificial intelligence have already been applied to musical systems: mobile devices can detect when their owners are running and play music that matches their pace; jukebox software studies your musical choices to make recommendations that suit or extend your taste. We propose that artificial intelligence could be directed towards predicting musical interactions. By studying how users interact with a sonic interface, we aim to isolate and recreate the elements of musical style as performed through common multi-touch devices.
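As a concrete illustration of the pace-matching idea above, the following hedged sketch estimates a runner's step cadence from accelerometer magnitudes and picks the track with the closest tempo. The threshold-crossing heuristic, the synthetic signal, and the track library are all invented for this example.

```python
# A hedged sketch of pace-matched playback: estimate step cadence from
# accelerometer magnitudes, then choose the track whose tempo is closest.
import math

def estimate_cadence(magnitudes: list[float], sample_rate_hz: float,
                     threshold: float = 1.2) -> float:
    """Count upward threshold crossings as steps; return steps per minute."""
    steps = sum(
        1 for prev, cur in zip(magnitudes, magnitudes[1:])
        if prev < threshold <= cur
    )
    duration_min = len(magnitudes) / sample_rate_hz / 60.0
    return steps / duration_min if duration_min > 0 else 0.0

def pick_track(cadence_spm: float, library: dict[str, float]) -> str:
    """Return the track whose BPM best matches the runner's cadence."""
    return min(library, key=lambda title: abs(library[title] - cadence_spm))

library = {"slow jog anthem": 140.0, "tempo run": 165.0, "sprint finish": 180.0}

# Fake 10 s of accelerometer data at 50 Hz with roughly 160 steps/min:
signal = [1.0 + 0.5 * math.sin(2 * math.pi * (160 / 60) * t / 50)
          for t in range(500)]
print(pick_track(estimate_cadence(signal, 50.0), library))
```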

New musical interfaces with embedded prediction capabilities could help novice performers to express complex musical ideas. They might help more established musical explorers engage in creative collaboration with others. Finally, they might even allow those who prefer to listen to experience the world around them in a new way, by connecting musical exploration with their everyday activities, movements, and requirements.

Photo: A series of 3D-printed miniature synthesisers that interact via light sensors and emitters.

Areas of Exploration

These goals can be addressed in a number of ways, but we are focussing on the following three areas for developing interactive musical systems:

  • Musical Apps: Almost everybody uses smartphones and other mobile devices. These computers are packed with sensors, expressive touchscreens, and powerful processors. Finding new ways to make music with mobile apps is a core direction for this research.
  • Musical Social Media: Social media encourages expression through words and images, but why shouldn't its users make music too? We are studying new musical systems that let social media users post musical expressions and react not just with "likes" or comments, but with further musical expressions that can be layered to form ensemble interactions (see the sketch after this list).
  • Sonic/Computing Objects: We are using rapid prototyping to explore new designs for physical musical objects that include embedded interactive music systems. Such systems may pave the way for physical musical instruments of the future that allow their owners to express themselves and also to collaborate with friends around the world.
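The sketch below illustrates how layered musical replies might be represented, assuming a hypothetical data model in which each post is a list of timestamped note events and replies nest under it; it describes no existing platform, and all names are invented.

```python
# A minimal sketch of "musical replies" layered into an ensemble, assuming
# a post is just a list of timestamped note events. Hypothetical data model.

from dataclasses import dataclass, field

@dataclass
class NoteEvent:
    time: float  # seconds from the start of the clip
    pitch: int   # MIDI note number

@dataclass
class MusicalPost:
    author: str
    notes: list[NoteEvent]
    replies: list["MusicalPost"] = field(default_factory=list)

    def ensemble_mix(self) -> list[tuple[str, NoteEvent]]:
        """Flatten this post and all replies into one time-sorted layer stack."""
        events = [(self.author, n) for n in self.notes]
        for reply in self.replies:
            events.extend(reply.ensemble_mix())
        return sorted(events, key=lambda pair: pair[1].time)

riff = MusicalPost("alice", [NoteEvent(0.0, 60), NoteEvent(1.0, 64)])
riff.replies.append(MusicalPost("bob", [NoteEvent(0.5, 67)]))
for author, note in riff.ensemble_mix():
    print(f"{note.time:>4}s  {author}: note {note.pitch}")
```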
Photo: An interactive music app design for sharing short musical ideas in a social media interface.

Technologies and Methodologies

We are addressing the above areas through a number of emerging technologies and research methods. Musical interaction through touch-screens and multi-modal sensors produces a large amount of very low-level data. We are using deep artificial neural networks (DNNs) to model, understand, and predict this data. Using a corpus of more than 4 million individual touch interactions, we are training networks that generate new musical touch patterns and react, like a musical ensemble, to touch-data from a real user. We are also exploring systems that include multiple internal models to handle interaction from users of different skill levels, those who prefer to listen or perform, and multiple touch-screen sounds performed through a single interface. Our apps and devices will be explored both in-the-wild through iterative experimental releases, and in-the-lab through controlled evaluations of the creative interfaces. We envision that our work with interactive music could lead to multiple public software releases as well as bespoke sonic hardware designs.
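To show what predicting touch interaction can look like in code, here is a much-simplified sequence-model sketch in PyTorch: each touch event is reduced to a (dx, dy, dt) triple, and an LSTM is trained to predict the next event from the sequence so far. The architecture, sizes, and synthetic data are illustrative assumptions only; the project's own models may well differ (for example, by using mixture density outputs to capture multimodal touch behaviour).

```python
# A much-simplified sketch of touch interaction as sequence prediction.
# Each touch event is reduced to (dx, dy, dt); an LSTM predicts the next
# event at every step. Sizes and data are stand-ins, not the real corpus.

import torch
import torch.nn as nn

class TouchRNN(nn.Module):
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=3, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 3)  # predict next (dx, dy, dt)

    def forward(self, touches: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(touches)
        return self.head(out)

model = TouchRNN()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One training step on synthetic stand-in data: 32 sequences of 20 events.
sequences = torch.randn(32, 20, 3)
targets = sequences[:, 1:, :]           # the next event at each step
predictions = model(sequences)[:, :-1, :]
optimiser.zero_grad()
loss = loss_fn(predictions, targets)
loss.backward()
optimiser.step()
print(f"training loss: {loss.item():.4f}")

# At performance time, the model can "reply" like an ensemble partner by
# predicting the next touch from the user's recent events.
```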


Tags: music technology, interactive music, prediction, EPEC, Machine Learning
By Charles Martin
Published Apr. 3, 2017 4:43 PM - Last modified Sep. 23, 2022 1:14 PM