IRCAM Real-Time Musical Interactions


This page is obsolete! Please visit the new webpage of the Sound Music Movement Interaction team.

The IMTR team conducts research and development on interactive music systems: gesture and sound modeling, interactive sound synthesis, gesture-capture systems, and interfaces. Applications cover music performance and, more generally, all of the performing arts. Digital techniques can be seen as augmenting or creating new instruments for performers – transforming sound, voice, gesture, and memory – and creating dialogues between artists and digital media.


MO - Modular Musical Objects from the Interlude project

Urban Musical Game at Agora and Futur en Seine Festival - 2011

DIRTI — Dirty Tangible Interfaces teaser for CHI 2013

DIRTI for iPad — Les petits chercheurs de sons (The Little Sound Explorers)

Mogees - Real-time gesture recognition with contact microphones for synthesis control

Interactive descriptor-based synthesis of a wind sound texture using corpus-based concatenative synthesis
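In corpus-based concatenative synthesis, each grain of a pre-analyzed sound corpus is described by audio descriptors (e.g. loudness, spectral centroid), and synthesis proceeds by selecting, for each target frame, the grain whose descriptors best match the target. The following is a minimal sketch of that selection step, not the actual system shown in the video; the toy descriptor values and function names are illustrative assumptions.

```python
import math

def closest_grain(corpus_descriptors, target):
    """Return the index of the corpus grain whose descriptor
    vector is nearest (Euclidean distance) to the target."""
    return min(range(len(corpus_descriptors)),
               key=lambda i: math.dist(corpus_descriptors[i], target))

def select_units(corpus_descriptors, target_trajectory):
    """Unit selection: for each target frame, pick the index of the
    best-matching corpus grain. (In a full system the selected
    grains would then be overlap-added to produce the output audio.)"""
    return [closest_grain(corpus_descriptors, t)
            for t in target_trajectory]

# Hypothetical toy corpus: each grain described by
# (loudness, spectral centroid in Hz).
corpus = [(0.1, 500.0), (0.5, 1500.0), (0.9, 3000.0)]
# Target descriptor trajectory, e.g. derived from a control gesture.
target = [(0.12, 520.0), (0.85, 2900.0)]
print(select_units(corpus, target))  # → [0, 2]
```

A real implementation (such as IRCAM's CataRT) would weight descriptors individually and use an efficient index (e.g. a k-d tree) rather than a linear scan, but the frame-by-frame nearest-neighbor selection is the core idea.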
