Abstract
General methodologies for analyzing music, even structuralist ones, implicitly rely on perceptual principles. Indeed, music cannot be thoroughly understood without an appreciation of its communicative value. In fact, the limitations encountered by contemporary approaches to automated musical pattern discovery all stem from an insufficient consideration of perception. It would therefore be highly beneficial to develop a computational approach to automated music analysis based on a cognitive modeling of music perception. This first step towards a cognitive understanding of musical pattern perception aims to conceive a general cognitive system able to produce the expected results without combinatorial explosion. A new general methodology for Musical Pattern Discovery is proposed, which attempts to mimic the flow of cognitive and sub-cognitive inferences carried out while hearing a piece of music. Patterns are discovered along the branches of a syntagmatic graph, which generalizes the syntagmatic chain to polyphonic contexts. A musical pattern class is defined as a set of characteristics that are approximately shared by different pattern occurrences within the score. Moreover, a pattern occurrence relies not only on internal sequence properties but also on external context. Chains of pattern occurrences are built on top of the score and interface with chains of pattern classes. Pattern classes may be associated with one another in order to formalize relations of inclusion or repetition. The implemented algorithm is able to discover pertinent patterns even when occurrences are, as in everyday music, translated, slightly distorted, slowed down or sped up. Such an understanding of music perception agrees with subjective experience. Such a computer model may offer musicology a detailed and explicit understanding of music, and may suggest to cognitive science the conditions necessary for a virtual perception of musical patterns.
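As a rough illustration only, the following Python sketch shows one possible way to represent the entities named above: notes linked by syntagmatic edges (several successors per note in a polyphonic context), occurrences chained over the score, and pattern classes grouping occurrences that approximately share a set of characteristics. All names and the choice of pitch intervals as the shared characteristic are hypothetical assumptions for this sketch, not the thesis's actual representation.

```python
# Minimal sketch (hypothetical names, not taken from the thesis itself) of the
# data structures described in the abstract.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Note:
    onset: float                                    # onset time, e.g. in beats
    pitch: int                                      # MIDI pitch number
    successors: list["Note"] = field(default_factory=list)  # syntagmatic edges;
                                                             # more than one in polyphony

@dataclass
class PatternOccurrence:
    notes: list["Note"]                             # notes covered by this occurrence
    previous: Optional["PatternOccurrence"] = None  # chain of occurrences along the score

@dataclass
class PatternClass:
    characteristics: tuple                          # description approximately shared
                                                    # by all occurrences (here: intervals)
    occurrences: list[PatternOccurrence] = field(default_factory=list)
    included_in: list["PatternClass"] = field(default_factory=list)  # inclusion relations
                                                                      # between classes

def intervals(notes: list[Note]) -> tuple:
    """One possible 'characteristic': successive pitch intervals, which are
    invariant under transposition of the occurrence."""
    return tuple(b.pitch - a.pitch for a, b in zip(notes, notes[1:]))
```

Using transposition-invariant characteristics such as intervals is only one example of how occurrences that are translated or slightly distorted could still be grouped under a common class; the actual model in the thesis defines these notions in far more detail.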