The general idea of the LEGOS project is to cross-fertilize interdisciplinary expertise in gesture-controlled sound systems with neuroscience, in particular regarding sensori-motor learning. We believe that sensori-motor learning is not sufficiently taken into account in the development of interactive sound systems. A better understanding of the sensori-motor learning mechanisms involved in gesture-sound coupling should provide us with efficient methodologies for evaluating and optimizing these interactive systems.
Such advances would significantly expand the usability of today’s gesture-based interactive sound systems, which are often developed empirically. We consider three application scenarios:
- Gesture learning or rehabilitation: the task is to perform a gesture guided by auditory feedback. Sensori-motor learning in this case is assessed in terms of gesture precision and repeatability.
- Movement-based sound control: the task is to produce a given sound through the manipulation of a gestural interface, as in digital musical instruments. Sensori-motor learning is assessed in terms of the quality of the sound production.
- Interactive sound design: the task is to manipulate a sonified object or tangible interface. Sensori-motor learning in this case is assessed through the quality and understanding of the object manipulation.
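As a concrete illustration of the gesture-sound coupling common to these three scenarios, the sketch below sonifies a one-dimensional gesture trajectory by mapping instantaneous speed to pitch and loudness, then renders the result to a WAV file. This is only a minimal sketch, not the project's implementation: the gesture signal, feature choices, mapping ranges, and sample rates are illustrative assumptions.

```python
# Minimal gesture-to-sound mapping sketch (illustrative, not the LEGOS system).
# A simulated 1-D gesture trajectory is sonified: speed drives pitch and loudness.

import numpy as np
import wave

SR = 44100          # audio sample rate (Hz), assumed
CONTROL_RATE = 100  # gesture sampling rate (Hz), assumed

# Simulated gesture: a smooth 2-second reaching movement (position in arbitrary units).
t = np.linspace(0.0, 2.0, int(CONTROL_RATE * 2.0))
position = 0.5 * (1.0 - np.cos(np.pi * t / t[-1]))

# Gesture feature: instantaneous speed, normalized to [0, 1].
speed = np.abs(np.gradient(position, 1.0 / CONTROL_RATE))
speed_norm = speed / (speed.max() + 1e-9)

# Map the feature to synthesis parameters (ranges chosen for illustration).
freq_hz = 220.0 + 660.0 * speed_norm   # pitch: 220-880 Hz
amp = 0.1 + 0.8 * speed_norm           # loudness: never fully silent

# Upsample control signals to audio rate and synthesize a sine oscillator.
n_audio = int(SR * t[-1])
t_audio = np.linspace(0.0, t[-1], n_audio)
freq_audio = np.interp(t_audio, t, freq_hz)
amp_audio = np.interp(t_audio, t, amp)
phase = 2.0 * np.pi * np.cumsum(freq_audio) / SR
signal = amp_audio * np.sin(phase)

# Write a 16-bit mono WAV file for offline listening.
pcm = (signal * 32767).astype(np.int16)
with wave.open("gesture_sonification.wav", "wb") as wf:
    wf.setnchannels(1)
    wf.setsampwidth(2)
    wf.setframerate(SR)
    wf.writeframes(pcm.tobytes())
```

In the scenarios above, such a mapping would be driven by real-time sensor data rather than a simulated trajectory, and the choice of features and mappings is precisely what the study of sensori-motor learning is meant to inform.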