Abstract
The subject of this master's thesis is the adaptation of “Pogany”, a tangible cephalomorphic interface designed and realized at the LIMSI-CNRS laboratory (Orsay, France), for music synthesis purposes. We are interested in methods for building an affective, emotion-based system for gesture identification, where gestures are captured through a facial interface that senses variations in luminosity caused by proximity or touch. After a brief discussion of related research, the report introduces issues in the conceptualization and development of a gesture learning and real-time recognition tool based on Hidden Markov Model (HMM) theory and technology. Employing direct features and mapping techniques, while also giving priority to the high-level information arising from the user's gestures, we have built a system for expressive music performance. The gesture training and recognition system used in this scope is thoroughly presented and evaluated. To assess the interface's overall performance as an expressive musical medium, we then describe an experiment in which several subjects interacted musically with Pogany, and we discuss the results. Finally, we outline directions for future work, as well as possible improvements concerning the interface, the recognition system, and its mapping to a music synthesis tool.