Gestural Auditory and Visual Interactive Platform


The project was initiated by Marc Rébillat (LIMSI-CNRS, École polytechnique) and brings together Tifanie Bouchara (LIMSI-CNRS), Gaëtan Parseihian (LIMSI-CNRS), Sarah Fdili Alaoui (LIMSI-CNRS, Ircam-CNRS), Matthieu Courgeon (LIMSI-CNRS), and Baptiste Caramiaux (Ircam-CNRS).


The objective of GAVIP is to design and exploit an immersive, interactive, and multimodal platform. The platform serves as a support for different scenarios, which typically immerse participants in a virtual environment populated with audio-graphical entities among which they can move freely. Participants interact with these entities in real time through their gestures: gesture analysis provides a natural way to control both the behaviour of the audio-graphical entities and their spatialization in the 3D scene. A great effort is therefore made to design a general architecture that supports a wide range of interaction strategies.

Current Implementation

  • The full version uses Wave Field Synthesis (WFS), stereoscopic graphical rendering, sound synthesis by CataRT, an OptiTrack motion-capture system, and an accelerometer-based controller.
  • The light version uses binaural rendering, 3D graphical rendering, sound synthesis by CataRT, and an accelerometer-based controller.
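To illustrate the kind of mapping the architecture supports, the sketch below shows a minimal gesture-to-spatialization loop: accelerometer input displaces an audio-graphical entity in the scene, and the entity's new position drives a simple constant-power stereo pan. All names (`Entity`, `apply_gesture`, `stereo_pan`) and the mapping itself are hypothetical placeholders, not the platform's actual API; the real system uses WFS or binaural rendering rather than stereo panning.

```python
import math
from dataclasses import dataclass

@dataclass
class Entity:
    """An audio-graphical entity positioned in the 3D scene (hypothetical model)."""
    x: float  # left/right, metres
    y: float  # up/down
    z: float  # depth, away from the listener

def apply_gesture(entity, accel, dt=0.05, gain=1.0):
    """Displace an entity proportionally to accelerometer input.

    A stand-in for the platform's gesture analysis: here the controller's
    acceleration vector is mapped directly to a displacement.
    """
    entity.x += gain * accel[0] * dt
    entity.y += gain * accel[1] * dt
    entity.z += gain * accel[2] * dt
    return entity

def stereo_pan(entity, listener=(0.0, 0.0, 0.0)):
    """Compute left/right gains from the entity's azimuth relative to the listener.

    Constant-power panning, used here only as a simple placeholder for the
    WFS/binaural spatialization of the full and light versions.
    """
    azimuth = math.atan2(entity.x - listener[0], entity.z - listener[2])
    # Clamp azimuth to [-pi/2, pi/2], then map it to a pan angle in [0, pi/2].
    theta = (max(-math.pi / 2, min(math.pi / 2, azimuth)) + math.pi / 2) / 2
    return math.cos(theta), math.sin(theta)  # (left gain, right gain)

# One step of the interaction loop: a rightward gesture shifts the entity,
# and the audio rendering follows.
entity = Entity(0.0, 0.0, 1.0)
apply_gesture(entity, (2.0, 0.0, 0.0))
left, right = stereo_pan(entity)  # right gain now exceeds left gain
```

In the actual platform, the same structure applies at a larger scale: gesture features control entity behaviour, and the entity positions feed the spatial audio and stereoscopic rendering engines.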