IRCAM Real-Time Musical Interactions
The IMTR Team conducts research and development on interactive music systems: gesture and sound modeling, interactive sound synthesis, and gesture-capture systems and interfaces. Applications cover music performance and, more generally, all the performing arts. Digital techniques can augment performers' instruments or create new ones, transforming sound, voice, gesture, and memory, and creating dialogues between artists and digital media.
- The Interlude project, coordinated by our team, won the ANR award "Prix ANR du Numérique 2013 - Impact sociétal" (2013 ANR Digital Award for Societal Impact)
- MO - Modular Musical Objects, from the Interlude project
- Urban Musical Game, presented at Agora and the Futur en Seine Festival (2011)
- Mogees - real-time gesture recognition with contact microphones for synthesis control
- Interactive descriptor-based synthesis of a wind sound texture using corpus-based concatenative synthesis
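The last item relies on corpus-based concatenative synthesis: a recording is sliced into short grains, each grain is labeled with audio descriptors, and synthesis proceeds by picking, for each target descriptor vector, the nearest grain in the corpus and concatenating the selections. The sketch below is a minimal illustration of that idea, not IRCAM's actual implementation; it assumes NumPy, two toy descriptors (RMS energy and spectral centroid), and filtered noise standing in for a recorded wind texture.

```python
import numpy as np

def descriptors(grain, sr=44100):
    """Two simple descriptors for one grain: RMS energy and spectral centroid."""
    rms = np.sqrt(np.mean(grain ** 2))
    spec = np.abs(np.fft.rfft(grain))
    freqs = np.fft.rfftfreq(len(grain), 1.0 / sr)
    centroid = np.sum(freqs * spec) / (np.sum(spec) + 1e-12)
    return np.array([rms, centroid])

def build_corpus(signal, grain_len=1024):
    """Slice a signal into fixed-length grains and describe each one."""
    n = len(signal) // grain_len
    grains = signal[: n * grain_len].reshape(n, grain_len)
    descs = np.array([descriptors(g) for g in grains])
    return grains, descs

def synthesize(targets, grains, descs):
    """For each target descriptor vector, concatenate the nearest corpus grain."""
    # Normalize each descriptor dimension so distances are comparable.
    mu, sigma = descs.mean(axis=0), descs.std(axis=0) + 1e-12
    d = (descs - mu) / sigma
    out = []
    for t in targets:
        tn = (t - mu) / sigma
        idx = np.argmin(np.sum((d - tn) ** 2, axis=1))
        out.append(grains[idx])
    return np.concatenate(out)

# Toy corpus: white noise standing in for a recorded wind texture.
rng = np.random.default_rng(0)
corpus_signal = rng.standard_normal(44100)
grains, descs = build_corpus(corpus_signal)

# Target trajectory: a subsampled, slightly perturbed copy of the corpus's
# own descriptors, as if driven interactively by a controller.
targets = descs[::10] * 1.05
result = synthesize(targets, grains, descs)
```

In a real system the descriptor targets would come from live input (a gesture, another sound), the corpus would be large and indexed for fast nearest-neighbor lookup, and grains would be cross-faded at their joins.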