Affiliation:
1. Department of Computer Science and Information Systems, University of Limerick
Abstract
In this paper we address the modelling and understanding of segmentation processes in melodic perception using a temporal multi-scale representation framework. We start from the hypothesis that segmentation depends on the ability of the perceptual system to detect changes in the sensory signal. In particular, we are interested in a model of change detection in music perception that would help us investigate functional aspects of low-level perceptual processes in music and their universality in terms of the general properties of the auditory system. To investigate this hypothesis, we have developed a temporal multi-scale model that mimics the listener's ability to detect changes in pitch, loudness and timbre when listening to performed melodies. The model is set within the linear scale-space theoretical framework, originally developed for image structure analysis but applied here to the temporal processing domain. It is structured so as to allow us to verify the assumption that segmentation is influenced both by the dynamics of signal propagation through a neural map and by learning and attention factors. Consequently, the model is examined from two perspectives: 1) the computational architecture modelling signal propagation is examined with respect to the universal, inborn aspects of segmentation; and 2) the model structure capable of influencing the choice of segmentation outcomes is explained, and some of its effects are examined against known segmentation results. The presented case studies demonstrate that the model accounts for some effects of perceptual organization of the sensory signal and provides a sound basis for analysing different types of changes, and their coordination across melodic descriptors, in segmentation decisions.
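To make the idea of a linear scale-space applied to the temporal domain concrete, the following is a minimal illustrative sketch (not the model described in this paper): a one-dimensional melodic descriptor (here a toy pitch contour) is smoothed with Gaussian kernels at several temporal scales, and candidate segment boundaries are flagged where the summed first-derivative response peaks. The scale values, threshold and peak-picking rule are hypothetical choices for illustration only.

```python
# Illustrative sketch only: a linear (Gaussian) scale-space over a 1D melodic
# descriptor, with candidate boundaries flagged at points of strong change.
# Scales, threshold and the peak-picking rule are hypothetical, not the paper's.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def boundary_strength(signal, sigmas=(1, 2, 4, 8)):
    """Magnitude of the first Gaussian derivative at several temporal scales."""
    return np.stack([np.abs(gaussian_filter1d(signal, s, order=1)) for s in sigmas])

def candidate_boundaries(signal, sigmas=(1, 2, 4, 8), threshold=0.5):
    """Indices where the scale-summed change response is a local maximum
    exceeding a fraction of its overall peak value."""
    response = boundary_strength(signal, sigmas).sum(axis=0)
    peaks = (response[1:-1] > response[:-2]) & (response[1:-1] > response[2:])
    strong = response[1:-1] > threshold * response.max()
    return np.where(peaks & strong)[0] + 1

# Toy pitch contour (MIDI note numbers) with a register shift between phrases.
pitch = np.array([60, 60, 62, 62, 60, 60, 67, 67, 69, 69, 67, 67], dtype=float)
print(candidate_boundaries(pitch))
```

In practice the same response could be computed for loudness and timbre descriptors and combined, which is the kind of coordination across melodic descriptors the abstract refers to.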
Subject
Music, Experimental and Cognitive Psychology