Abstract
This paper presents ongoing work on gesture mapping strategies and applications to sound synthesis
by signal models controlled via a standard MIDI wind controller. Our approach consists of
considering different mapping strategies in order to achieve "fine" (and therefore, in the authors'
opinion, potentially expressive) control of additive synthesis by coupling originally independent
outputs from the wind controller. These control signals are applied to nine different clarinet data
files obtained from analysis of clarinet sounds, which are arranged in an expressive timbral subspace
and interpolated in real time using FTS 1.4, IRCAM's digital signal processing environment. An analysis
of the resulting interpolation is also provided and topics related to sound morphing are discussed.
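As a purely illustrative sketch (not the paper's FTS 1.4 implementation), the real-time interpolation between the analysed clarinet data files can be pictured as a weighted mix of per-frame partial parameters, with weights derived from the controller-driven position in the timbral subspace. The function names, the 3x3 grid layout, and the inverse-distance weighting below are assumptions made for illustration only.

```python
import numpy as np

# Assumed setup: each analysed clarinet file yields, per frame, arrays of
# partial frequencies and amplitudes of shape (n_partials,). 'anchors' are
# the nine points in a 2-D timbral subspace at which the data files sit;
# 'position' is where the coupled wind-controller outputs place the player.

def interpolation_weights(position, anchors, eps=1e-6):
    """Inverse-distance weights over the anchor points (one per data file)."""
    d = np.linalg.norm(anchors - position, axis=1)
    w = 1.0 / (d + eps)
    return w / w.sum()

def interpolate_frame(position, anchors, freqs, amps):
    """Weighted mix of partial frequencies and amplitudes for one frame.

    freqs, amps: arrays of shape (n_files, n_partials).
    """
    w = interpolation_weights(position, anchors)
    return w @ freqs, w @ amps

# Example: nine files on a 3x3 grid, 20 partials each (synthetic data).
anchors = np.array([[x, y] for x in (0.0, 0.5, 1.0) for y in (0.0, 0.5, 1.0)])
rng = np.random.default_rng(0)
freqs = 150.0 * np.arange(1, 21) * np.ones((9, 1))  # harmonic series per file
amps = rng.random((9, 20))
f, a = interpolate_frame(np.array([0.3, 0.7]), anchors, freqs, amps)
```

The weighting scheme shown here is only one plausible choice; the paper's own interpolation and its behaviour are analysed in the sections that follow.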