The virtual bath tub is an interactive virtual surface in the air that the user can interact with, as if a bath tub filled with water stood in front of them.
Inspired by the difficulties foley artists face when manipulating water, this scenario lets the user play with splash and underwater sound sequences driven by the energy of their movements.
Conception: Eric O. Boyer.
Motion capture with Leap Motion Device.
Sound synthesis made with MuBu objects (www.ismm.ircam.fr).
Sound materials by Roland Cahen and Diemo Schwarz (Topophonie Project).
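The idea of driving splash and underwater sounds by movement energy can be sketched as follows. This is a minimal illustration, not the actual implementation: the function names, the energy proxy (mean squared hand speed from Leap Motion position samples), and the threshold value are all assumptions for the sake of the example.

```python
import math

def movement_energy(positions, dt):
    """Proxy for movement energy: mean squared speed over successive
    3D hand positions (metres) sampled at interval dt (seconds)."""
    speeds = [math.dist(p0, p1) / dt
              for p0, p1 in zip(positions, positions[1:])]
    return sum(v * v for v in speeds) / len(speeds)

def select_sample(energy, splash_threshold=1.0):
    """Map movement energy to a sound layer: vigorous movements trigger
    splash samples, gentle ones underwater textures (hypothetical
    threshold, tuned by ear in practice)."""
    return "splash" if energy >= splash_threshold else "underwater"
```

A slow hand drift would then select the underwater texture, while a fast swipe crosses the threshold and triggers a splash; in the real system the energy value could also scale playback intensity continuously.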
The paper "Low-cost motion sensing of table tennis players for real time feedback" by Eric Boyer, Frédéric Bevilacqua, François Phal and Sylvain Hanneton has been published in the International Journal of Table Tennis Sciences, no. 8. This publication follows the presentation at the Sport Science Congress.
Three contributions were accepted at the 21st ACM International Conference on Multimedia in Barcelona, Spain. These works describe our current approach and the technology we are developing to build motion–sound interactive systems. These systems should favor the building of an action–perception loop (motion–sound) and should facilitate sensorimotor learning, which will be evaluated next.
Jules Françoise won the Best Doctoral Symposium Paper award (see below).
- J. Françoise, N. Schnell and F. Bevilacqua, "A Multimodal Probabilistic Model for Gesture-Based Control of Sound Synthesis", in Proceedings of the 21st ACM International Conference on Multimedia, MM '13, (New York, NY, USA), pp. 705–708, ACM, 2013.
- J. Françoise, N. Schnell, and F. Bevilacqua, "Gesture-based control of physical modeling sound synthesis: A mapping-by-demonstration approach", in Proceedings of the 21st ACM International Conference on Multimedia, MM '13, (New York, NY, USA), pp. 447–448, ACM, 2013.
Two papers accepted for the conference Computer Music Multidisciplinary Research 2013: Sound, Music & Motion.
We presented two papers at the conference Multimodal Motor Behaviour: Impact of Sound, organized by Leibniz Universität Hannover and ETH Zurich, which took place on 09.31 and 10.01 in Hannover.
- E. O. Boyer, L. Colin Malagon, F. Bevilacqua, P. Susini and S. Hanneton, "Continuous sound feedback in tracking tasks",
presenting ongoing work comparing different sonification mappings and their contribution to a 2D visual tracking task.
- F. Bevilacqua, A. Vanzandt-Escobar, N. Schnell, E. O. Boyer, N. Rasamimanana, S. Hanneton and A. Roby-Brami, "Sonification of the coordination of arm movements",
introducing preliminary work on hemiparesis rehabilitation with real-time sound feedback.
- Screenshot of a demo video showing a prototype to simulate hemiparetic patients.
Summer School 2013 Human Computer Confluence, http://hcsquared.eu
Workshop From everyday objects to sonic interaction design
July 17-19 2013, Ircam, PDS and IMTR research teams (collaboration with Goldsmiths-London)
Lauren Hayes – PhD student in creative music practice
Emmanouil Giannisakis – Master student in digital media engineering
Jaime Arias Almeida – PhD student in informatics
Alberto Betella – PhD student in communication, information and audiovisual media
David Hofmann – PhD student in theoretical neuroscience
Paper accepted for the conference 13th International Table Tennis Federation Sport Science Congress in Paris, May 12-13 2013
A potential application of sonification for learning specific Table Tennis gestures will be explained and demonstrated.
E. Boyer, F. Bevilacqua, F. Phal, and S. Hanneton, "Low-cost motion sensing of table tennis players for real time feedback", 13th International Table Tennis Federation Sport Science Congress in Paris, May 12-13 2013.
Two posters were accepted for the Progress in Motor Control IX meeting, to be held at McGill University in Montreal, July 14-16 2013.
- E. Boyer, Q. Pyanet, S. Hanneton, and F. Bevilacqua, “Sensorimotor adaptation to a gesture-sound mapping perturbation”
- S. Hanneton, E. Boyer, and V. Forma, “Influence of an error-related auditory feedback on the adaptation to a visuo-manual perturbation”
The following poster was presented during the inauguration day of the LABEX SMART, March 26 2013.
It reports on results of the Legos project and on the collaboration between the STMS Lab IRCAM-CNRS-UPMC and ISIR-UPMC.
E. O. Boyer, F. Bevilacqua, S. Hanneton, A. Roby-Brami, J. Françoise, N. Schnell, O. Houix, N. Misdariis, P. Susini and I. Viaud-Delmon, "Sensorimotor Learning in Gesture-Sound Interactive Systems"
Eric O. Boyer (1, 2)*
Bénédicte M. Babayan (3)
- 1 CNRS UMR 9912 IRCAM, France
- 2 Laboratoire de Neurophysique et Physiologie, CNRS UMR 8119, Université Paris Descartes, UFR Biomédicale des Saints-Pères, France
- 3Neurobiologie des Processus Adaptatifs, CNRS UMR 7102, UPMC, France
- 4Institut des systèmes intelligents et de robotique (ISIR) CNRS UMR 7222, UPMC, France
Studies of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed towards unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand. To accomplish this, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in heard hand position. Localization errors were exacerbated by short durations of target presentation but not modified by auditory feedback of hand position. Long durations of target presentation gave rise to a higher level of accuracy and were accompanied by early automatic head-orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of dynamic changes of the acoustic cues, due to changes in head orientation, to support online motor control. The design of an informative acoustic feedback needs to be studied carefully to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space.
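Two of the quantities manipulated and measured above can be made concrete: the shift in heard hand position (a rotation of the sonified hand around the listener) and the angular localization error of a pointing movement. The sketch below is a hypothetical illustration of those computations in the 2D horizontal plane; function names and the rotation-based perturbation model are assumptions, not the study's actual apparatus.

```python
import math

def heard_position(hand_xy, shift_deg, origin=(0.0, 0.0)):
    """Simulate a shift in heard hand position: rotate the actual hand
    position by shift_deg around the listener's origin before sonifying it."""
    x, y = hand_xy[0] - origin[0], hand_xy[1] - origin[1]
    a = math.radians(shift_deg)
    return (origin[0] + x * math.cos(a) - y * math.sin(a),
            origin[1] + x * math.sin(a) + y * math.cos(a))

def pointing_error_deg(endpoint_xy, target_xy, origin=(0.0, 0.0)):
    """Angular localization error: signed difference between the directions
    of the movement endpoint and of the target, seen from the origin."""
    ang = lambda p: math.atan2(p[1] - origin[1], p[0] - origin[0])
    err = math.degrees(ang(endpoint_xy) - ang(target_xy))
    return (err + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
```

If subjects compensate for the feedback shift, the pointing error measured on the actual endpoint should deviate opposite to the applied rotation; comparing error distributions across shift values is one way to quantify online auditory control.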