Our aim is to develop a gesture follower for the performing arts, which indicates in real time the temporal correspondences between an observed gesture sequence and a fixed reference gesture sequence (in other words, it time-warps the observed sequence to the learned sequence in real time). Applications include, for example, choreography and (composed) music. The ability to continuously establish correspondences between observed and reference gestures has important potential. For example, it enables the quantitative comparison of gesture data, at corresponding moments, between live and previously recorded performances. In particular, this should facilitate the computation of parameters related to interpretation and expression. Moreover, it provides new tools to control and dynamically change mapping settings between gesture and digital media.
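To make the idea of real-time time warping concrete, the following is a minimal illustrative sketch (not the system's actual model): an online dynamic-time-warping-style update that, for each incoming 1-D feature frame, refreshes a cost column against the reference sequence and reports the reference index the live frame currently aligns to. The function names `follow_step` and `follow` are hypothetical, introduced only for this example.

```python
import numpy as np

def follow_step(prev, reference, x, first):
    """Update one DTW cost column for the incoming frame x.

    prev      -- cost column from the previous frame (None on the first frame)
    reference -- 1-D array of reference feature frames
    x         -- current observed feature frame
    first     -- True for the very first observed frame
    """
    n = len(reference)
    cur = np.empty(n)
    for j in range(n):
        cost = abs(x - reference[j])  # local distance to reference frame j
        if first and j == 0:
            cur[j] = cost
        elif first:
            cur[j] = cost + cur[j - 1]          # can only come from below
        elif j == 0:
            cur[j] = cost + prev[0]             # can only come from the past
        else:
            # standard DTW recursion: match, insertion, or deletion
            cur[j] = cost + min(prev[j], prev[j - 1], cur[j - 1])
    return cur

def follow(reference, stream):
    """Return, for each observed frame, the best-matching reference index."""
    prev = None
    positions = []
    for t, x in enumerate(stream):
        prev = follow_step(prev, reference, x, first=(t == 0))
        positions.append(int(np.argmin(prev)))  # current position in reference
    return positions
```

For instance, a reference performed at half speed is tracked frame by frame: `follow([0, 1, 2, 3, 4], [0, 0, 1, 1, 2, 2, 3, 3, 4, 4])` yields `[0, 0, 1, 1, 2, 2, 3, 3, 4, 4]`. A practical follower would use multidimensional features and a probabilistic model rather than this raw cost minimum, but the column-by-column update captures the real-time alignment principle.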