The paper “Low-cost motion sensing of table tennis players for real time feedback” by Eric Boyer, Frédéric Bevilacqua, François Phal and Sylvain Hanneton has been published in the International Journal of Table Tennis Sciences, no. 8. This publication follows the presentation at the Sport Science Congress.
Three contributions were accepted at the 21st ACM International Conference on Multimedia in Barcelona, Spain. These works describe our current approach and the technology we are developing to build motion–sound interactive systems. These systems should favor the building of an action–perception loop (motion–sound) and facilitate sensorimotor learning, which will be evaluated next.
Jules Françoise won the Best Doctoral Symposium Paper award (see below).
- J. Françoise, N. Schnell, and F. Bevilacqua, “A Multimodal Probabilistic Model for Gesture-based Control of Sound Synthesis”, in Proceedings of the 21st ACM International Conference on Multimedia, MM ’13, (New York, NY, USA), pp. 705–708, ACM, 2013.
- J. Françoise, N. Schnell, and F. Bevilacqua, “Gesture-based control of physical modeling sound synthesis: A mapping-by-demonstration approach”, in Proceedings of the 21st ACM International Conference on Multimedia, MM ’13, (New York, NY, USA), pp. 447–448, ACM, 2013.
- J. Françoise, “Gesture–sound mapping by demonstration in interactive music systems”, in Proceedings of the 21st ACM International Conference on Multimedia, MM ’13, (New York, NY, USA), pp. 1051–1054, ACM, 2013.
- 1 CNRS UMR 9912, IRCAM, France
- 2 Laboratoire de Neurophysique et Physiologie, CNRS UMR 8119, Université Paris Descartes, UFR Biomédicale des Saints-Pères, France
- 3 Neurobiologie des Processus Adaptatifs, CNRS UMR 7102, UPMC, France
- 4 Institut des systèmes intelligents et de robotique (ISIR), CNRS UMR 7222, UPMC, France
Studies of the neural mechanisms involved in goal-directed movements tend to concentrate on the role of vision. We present here an attempt to address the mechanisms whereby an auditory input is transformed into a motor command. The spatial and temporal organization of hand movements was studied in normal human subjects as they pointed towards unseen auditory targets located in a horizontal plane in front of them. Positions and movements of the hand were measured by a six-camera infrared tracking system. In one condition, we assessed the role of auditory information about target position in correcting the trajectory of the hand; to this end, the duration of the target presentation was varied. In another condition, subjects received continuous auditory feedback of their hand movement while pointing to the auditory targets. Online auditory control of the direction of pointing movements was assessed by evaluating how subjects reacted to shifts in the heard hand position. Localization errors were exacerbated by short target presentations but were not modified by auditory feedback of hand position. Long target presentations gave rise to a higher level of accuracy and were accompanied by early, automatic head-orienting movements consistently related to target direction. These results highlight the efficiency of auditory feedback processing in online motor control and suggest that the auditory system takes advantage of the dynamic changes in acoustic cues induced by changes in head orientation. How to design an informative acoustic feedback still needs to be carefully studied in order to demonstrate that auditory feedback of the hand could assist the monitoring of movements directed at objects in auditory space.