-
(2020).
Evolutionary algorithms for intelligent robots.
-
(2019).
Kunstig intelligens for tilpasningsdyktige roboter. (Eng. Artificial Intelligence for Adaptive Robots).
-
Martin, Charles Patrick & Tørresen, Jim
(2019).
An Interactive Music Prediction System with Mixture Density Recurrent Neural Networks.
-
Næss, Torgrim Rudland; Tørresen, Jim & Martin, Charles Patrick
(2019).
A Physical Intelligent Instrument using Recurrent Neural Networks.
-
Martin, Charles Patrick; Næss, Torgrim Rudland; Faitas, Andrei & Baumann, Synne Engdahl
(2019).
Session on Musical Prediction and Generation with Deep Learning.
-
Faitas, Andrei; Baumann, Synne Engdahl; Tørresen, Jim & Martin, Charles Patrick
(2019).
Generating Convincing Harmony Parts with Simple Long Short-Term Memory Networks.
-
Miseikis, Justinas; Brijacak, Inka; Yahyanejad, Saeed; Glette, Kyrre; Elle, Ole Jacob & Tørresen, Jim
(2019).
Two-Stage Transfer Learning for Heterogeneous Robot Detection and 3D Joint Position Estimation in a 2D Camera Image Using CNN.
-
Tørresen, Jim
(2019).
Making Robots Adaptive and Preferable to Humans.
-
Tørresen, Jim
(2019).
Kunstig intelligens – hvem, hva og hvor.
(Eng. Artificial Intelligence – who, what and where).
-
Ellefsen, Kai Olav
(2019).
Hva Kan Roboter Lære av Biologisk Liv? (Eng. What Can Robots Learn from Biological Life?)
-
Tørresen, Jim; Glette, Kyrre & Ellefsen, Kai Olav
(2019).
Intelligent, Adaptive Robots in Real-World Scenarios.
-
Becker, Artur; Herrebrøden, Henrik; Gonzalez Sanchez, Victor Evaristo; Nymoen, Kristian; Dal Sasso Freitas, Carla Maria & Tørresen, Jim
(2019).
Functional Data Analysis of Rowing Technique Using Motion Capture Data.
We present an approach to analyzing the motion capture data of rowers using bivariate functional principal component analysis (bfPCA). The method has been applied on data from six elite rowers rowing on an ergometer. The analyses of the upper and lower body coordination during the rowing cycle revealed significant differences between the rowers, even though the data was normalized to account for differences in body dimensions. We make an argument for the use of bfPCA and other functional data analysis methods for the quantitative evaluation and description of technique in sports.
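For readers unfamiliar with the technique named in this abstract, a minimal sketch of the bfPCA idea follows, using synthetic stand-in curves (the rowers' actual motion capture data is not reproduced here): resample each rower's two coordination curves onto a common time grid, stack both channels into one observation vector per rower, and run an ordinary PCA on the stacked vectors.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rowers, n_samples = 6, 100          # hypothetical: 6 rowers, 100 time points per stroke cycle
t = np.linspace(0, 1, n_samples)

# Two synthetic angle curves per rower (upper and lower body), with per-rower variation.
upper = np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal((n_rowers, n_samples))
lower = np.cos(2 * np.pi * t) + 0.1 * rng.standard_normal((n_rowers, n_samples))

# Bivariate functional data: stack both channels into one observation vector per rower.
X = np.hstack([upper, lower])          # shape (6, 200)
X_centered = X - X.mean(axis=0)

# PCA via SVD; rows of Vt are the (bivariate) principal component functions.
U, s, Vt = np.linalg.svd(X_centered, full_matrices=False)
explained = s**2 / np.sum(s**2)        # fraction of variance per component
scores = X_centered @ Vt.T             # per-rower PC scores used to compare technique
```

The per-rower scores along the leading components are the quantities one would compare across rowers to describe differences in technique.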
-
Ellefsen, Kai Olav; Huizinga, Joost & Tørresen, Jim
(2019).
Guiding Neuroevolution with Structural Objectives.
-
Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Ellefsen, Kai Olav; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2019).
Experiences from Real-World Evolution with DyRET: Dynamic Robot for Embodied Testing.
-
Teigen, Bjørn Ivar; Ellefsen, Kai Olav & Tørresen, Jim
(2019).
A Categorization of Reinforcement Learning Exploration Techniques Which Facilitates Combination of Different Methods.
-
Ellefsen, Kai Olav & Tørresen, Jim
(2019).
Self-Adapting Goals Allow Transfer of Predictive Models to New Tasks.
-
Nordmoen, Jørgen Halvorsen; Nygaard, Tønnes Frostad; Ellefsen, Kai Olav & Glette, Kyrre
(2019).
Evolved embodied phase coordination enables robust quadruped robot locomotion.
-
Tørresen, Jim
(2019).
Design and Control of Robots for Real-World Environment.
-
Tørresen, Jim
(2019).
Artificial Intelligence and Applications in Health and Care.
-
Tørresen, Jim
(2019).
Sensing Human State with Application in Older People Care and Mental Health Treatment.
-
Tørresen, Jim
(2019).
Hva er kunstig intelligens? (Eng. What is artificial intelligence?)
-
Tørresen, Jim
(2019).
Supporting Older People with Robots for Independent Living.
-
Tørresen, Jim
(2019).
Future and Ethical Perspectives of Robotics and AI.
-
Ellefsen, Kai Olav & Tørresen, Jim
(2019).
Evolutionary Robotics: Automatic design of robot bodies and control.
-
Tørresen, Jim; Glette, Kyrre & Ellefsen, Kai Olav
(2019).
Adaptive Robot Body and Control for Real-World Environments.
-
Miura, Jun & Tørresen, Jim
(2019).
Intelligent Robot Technologies for Care and Lifestyle Support.
-
Nordmoen, Jørgen Halvorsen & Fadelli, Ingrid
(2019).
A new method to enable robust locomotion in a quadruped robot.
[Internet].
TechXplore.
-
Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2019).
Lessons Learned from Real-World Experiments with DyRET: the Dynamic Robot for Embodied Testing.
-
Martin, Charles Patrick & Tørresen, Jim
(2019).
An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks.
-
Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2019).
Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing.
-
Tørresen, Jim
(2019).
Intelligent Robots and Systems in Real-World Environment.
-
Rohlfing, Katharina J. & Tørresen, Jim
(2019).
Explainability: an interactive view.
-
Comba, Joao Luiz Dihl & Tørresen, Jim
(2019).
Visual Data Analysis of Unstructured and Big Data.
-
Glette, Kyrre; Nygaard, Tønnes Frostad & Vogt, Yngve
(2019).
Her er universitetets nest selvlærende robot. (Eng. Here is the university's second self-learning robot).
[Business/trade/industry journal].
Teknisk ukeblad.
-
Tørresen, Jim
(2019).
Intelligent and Adaptive Robots in Real-World Environment.
-
Martin, Charles Patrick; Glette, Kyrre; Nygaard, Tønnes Frostad & Tørresen, Jim
(2018).
Self-Awareness in a Cyber-Physical Predictive Musical Interface.
We introduce a new self-contained and self-aware interface for musical expression where a recurrent neural network (RNN) is integrated into a physical instrument design. The system includes levers for physical input and output, a speaker system, and an integrated single-board computer. The RNN serves as an internal model of the user’s physical input, and predictions can replace or complement direct sonic and physical control by the user. We explore this device in terms of different interaction configurations and learned models according to frameworks of self-aware cyber-physical systems.
-
Tørresen, Jim; Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav & Martin, Charles Patrick
(2018).
Equipping Systems with Forecasting Capabilities.
-
Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav; Martin, Charles Patrick & Tørresen, Jim
(2018).
Prediction, Interaction, and User Behaviour.
The goal of this tutorial is to apply predictive machine learning models to human behaviour through a human computer interface. We will introduce participants to the key stages for developing predictive interaction in user-facing technologies: collecting and identifying data, applying machine learning models, and developing predictive interactions. Many of us are aware of recent advances in deep neural networks (DNNs) and other machine learning (ML) techniques; however, it is not always clear how we can apply these techniques in interactive and real-time applications. Apart from well-known examples such as image classification and speech recognition, what else can predictive ML models be used for? How can these computational intelligence techniques be deployed to help users?
In this tutorial, we will show that ML models can be applied to many interactive applications to enhance users’ experience and engagement. We will demonstrate how sensor and user interaction data can be collected and investigated, modelled using classical ML and DNNs, and where predictions of these models can feed back into an interface. We will walk through these processes using live-coded demonstrations with Python code in Jupyter Notebooks so participants will be able to see our investigations live and take the example code home to apply in their own projects.
Our demonstrations will be motivated by examples from our own research in creativity support tools, robotics, and modelling user behaviour. In creativity, we will show how streams of interaction data from a creative musical interface can be modelled with deep recurrent neural networks (RNNs). From this data, we can predict users’ future interactions, or the potential interactions of other users. This enables us to “fill in” parts of a tablet-based musical ensemble when other users are not available, or to continue a user’s composition with potential musical parts. In user behaviour, we will show how smartphone sensor data can be used to infer user contextual information such as physical activities. This contextual information can be used to trigger interactions in smart home or internet of things (IoT) environments, to help tune interactive applications to users’ needs, or to help track health data.
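As a rough illustration of the predictive step this tutorial describes, the core of a mixture density output layer is choosing a Gaussian component by its weight and sampling from it. All parameters below are hypothetical stand-ins, not values from the tutorial's trained models.

```python
import numpy as np

def sample_mdn(weights, means, stds, rng):
    """Draw one sample from a 1-D Gaussian mixture, as output by a mixture density network."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()          # normalise the mixture weights
    k = rng.choice(len(weights), p=weights)    # pick a mixture component
    return rng.normal(means[k], stds[k])       # sample from that component's Gaussian

rng = np.random.default_rng(1)
# Hypothetical parameters an MDRNN might emit for the next touch position (in [0, 1]).
next_touch = sample_mdn([0.7, 0.3], [0.2, 0.8], [0.05, 0.05], rng)
```

In an interactive setting, each sampled value would be played back to the user and fed in as the next network input, producing a continuing stream of predicted interactions.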
-
Martin, Charles Patrick
(2018).
MicroJam.
MicroJam is a mobile app for sharing tiny touch-screen performances. Mobile applications that streamline creativity and social interaction have enabled a very broad audience to develop their own creative practices. While these apps have been very successful in visual arts (particularly photography), the idea of social music-making has not had such a broad impact. MicroJam includes several novel performance concepts intended to engage the casual music maker and inspired by current trends in social creativity support tools. Touch-screen performances are limited to 5-seconds, instrument settings are posed as sonic "filters", and past performances are arranged as a timeline with replies and layers. These features of MicroJam encourage users not only to perform music more frequently, but to engage with others in impromptu ensemble music making.
-
Tørresen, Jim
(2018).
Ethical Robots and Autonomous Systems.
-
Tørresen, Jim
(2018).
UiO Visit to UFRJ – An overview of research.
-
Tørresen, Jim
(2018).
Artificial Intelligence Applied for Real-World Systems.
-
Martin, Charles Patrick & Tørresen, Jim
(2018).
Predictive Musical Interaction with MDRNNs.
-
Tørresen, Jim
(2018).
Kunstig intelligens – hvem, hva og hvor. (Eng. Artificial Intelligence – who, what and where).
-
Tørresen, Jim
(2018).
Frelsende eller fatalt? (Eng. Saving or fatal?)
[Business/trade/industry journal].
Forskningsetikk.
-
Stoica, Adrian & Tørresen, Jim
(2018).
Robots on the Moon, and their Role in a Future Lunar Economy.
-
Tørresen, Jim
(2018).
Roboter kommer nærmere – skal vi glede eller grue oss? (Eng. Robots are getting closer – should we be excited or worried?)
-
Tørresen, Jim
(2018).
Artificial Intelligence – State-of-the-art.
-
Martin, Charles Patrick; Jensenius, Alexander Refsum & Tørresen, Jim
(2018).
Composing an ensemble standstill work for Myo and Bela.
This paper describes the process of developing a standstill performance work using the Myo gesture control armband and the Bela embedded computing platform. The combination of Myo and Bela allows a portable and extensible version of the standstill performance concept while introducing muscle tension as an additional control parameter. We describe the technical details of our setup and introduce Myo-to-Bela and Myo-to-OSC software bridges that assist with prototyping compositions using the Myo controller.
-
Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria
(2018).
Stillness under Tension.
-
Martin, Charles Patrick; Xambó, Anna; Visi, Federico; Morreale, Fabio & Jensenius, Alexander Refsum
(2018).
Stillness under Tension.
Stillness Under Tension is an ensemble standstill work for the Myo gesture control armband and the Bela embedded music platform. Humans are incapable of standing completely still due to breathing and other involuntary micromotions. This work explores the expressive space of standing still through an inverse action-sound mapping: less movement leads to more sound. Four performers stand as still as possible on stage, each wearing a Myo armband connected to a Bela embedded sound processing platform. The Myo is used to measure the performers' movement and the muscle activity in their forearms, which they can use, both voluntarily and involuntarily, to control a synthesised sound world. Each performer uses one Myo and Bela in a musical space defined by their physical position and posture while standing still.
-
Tørresen, Jim
(2018).
Remote Lab and Applications for High Performance and Embedded Architectures.
-
Tørresen, Jim
(2018).
Kunstig Intelligens – Lærende og tilpasningsdyktig teknologi. (Eng. Artificial Intelligence – Learning and adaptive technology).
-
Ellefsen, Kai Olav
(2018).
Evolusjonær Robotikk: Automatisk design og kontroll av roboter. (Eng. Evolutionary Robotics: Automatic design and control of robots).
-
Søyseth, Vegard Dønnem; Nygaard, Tønnes Frostad; Martin, Charles Patrick; Uddin, Md Zia & Ellefsen, Kai Olav
(2018).
ROBIN-Stand ved Cutting Edge 2018. (Eng. ROBIN stand at Cutting Edge 2018).
-
Martin, Charles Patrick
(2018).
Deep Predictive Models in Interactive Music.
-
Næss, Torgrim Rudland; Martin, Charles Patrick & Tørresen, Jim
(2019).
A Physical Intelligent Instrument using Recurrent Neural Networks.
Universitetet i Oslo.
-
Wallace, Benedikte & Martin, Charles Patrick
(2018).
Predictive songwriting with concatenative accompaniment.
Universitetet i Oslo.
-
Tørresen, Jim; Teigen, Bjørn Ivar & Ellefsen, Kai Olav
(2018).
An Active Learning Perspective on Exploration in Reinforcement Learning.
Universitetet i Oslo.
-
Fjeld, Matias Hermanrud & Tørresen, Jim
(2018).
3D Spatial Navigation in Octrees with Reinforcement Learning.
Universitetet i Oslo.