This article discusses the adaptation of ’Pogany’, a tangible cephalomorphic interface designed and built at the LIMSI-CNRS laboratory, for music synthesis purposes. We are interested in methods for building an affective, emotion-based system for gesture and posture identification, captured through a facial interface that senses variations in luminosity produced by hand proximity or touch. After a brief discussion of related research, the article introduces issues in the development of a robust gesture learning and recognition tool based on HMMs. Results of the first gesture training and recognition system built are presented and evaluated. Future work is described, along with further possible improvements concerning the interface, the recognition system, and its mapping to a music synthesis tool.
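As a rough illustration only (not the system described in this article), HMM-based gesture recognition of the kind mentioned above can be sketched with discrete-observation models scored by the forward algorithm: sensor readings are quantized into symbols, one HMM is trained per gesture, and an incoming sequence is assigned to the model with the highest likelihood. All model names, parameters, and symbol alphabets below are toy assumptions.

```python
import math

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood of an observation sequence under a discrete HMM,
    computed with the forward algorithm."""
    n = len(start)
    # Initialisation: alpha_0(i) = pi_i * b_i(o_0)
    alpha = [start[i] * emit[i][obs[0]] for i in range(n)]
    for t in range(1, len(obs)):
        # Induction: alpha_t(i) = (sum_j alpha_{t-1}(j) * a_ji) * b_i(o_t)
        alpha = [
            sum(alpha[j] * trans[j][i] for j in range(n)) * emit[i][obs[t]]
            for i in range(n)
        ]
    return math.log(sum(alpha))

# Two toy 2-state HMMs over 3 quantized "luminosity" symbols (0, 1, 2),
# standing in for two trained gesture models (hypothetical parameters).
press = dict(start=[0.9, 0.1],
             trans=[[0.7, 0.3], [0.3, 0.7]],
             emit=[[0.8, 0.15, 0.05], [0.1, 0.2, 0.7]])
stroke = dict(start=[0.5, 0.5],
              trans=[[0.5, 0.5], [0.5, 0.5]],
              emit=[[0.1, 0.8, 0.1], [0.2, 0.6, 0.2]])

def classify(obs):
    """Assign a symbol sequence to the gesture model with the
    highest forward likelihood."""
    scores = {name: forward_log_likelihood(obs, **m)
              for name, m in [("press", press), ("stroke", stroke)]}
    return max(scores, key=scores.get)

print(classify([0, 0, 2, 2, 2]))  # → press
print(classify([1, 1, 1, 1, 1]))  # → stroke
```

In a real system the per-gesture models would be trained with Baum-Welch on recorded gesture sequences, and longer sequences would require log-space or scaled recursions to avoid numerical underflow.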