Affiliations:
1. Music Perception and Cognition Laboratory, McGill University, Schulich School of Music, Montreal, Quebec, Canada
2. Department of Psychology, McGill University, Montreal, Quebec, Canada
Abstract
In the investigation of musical features that influence musical affect, timbre has received relatively little attention. Investigating affective timbre at the level of instrument families can lead to inconsistent results, because a single instrument family can produce a wide variety of timbres. Here, we instead consider timbre descriptors, fine-grained acoustic representations of a sound. Using identical methods, we re-analyzed and synthesized results from three previously published studies: Eerola et al. (2012, Mus. Percept.), McAdams et al. (2017, Front. Psychol.), and Korsmit et al. (2023, Front. Psychol.). In doing so, we aimed to reveal robust timbre descriptors that consistently predict affective responses and to explain discrepancies among results that arise from differences in experimental methodology. We computed spectral, temporal, and spectro-temporal descriptors from all stimuli and used them to predict affect ratings with linear and nonlinear methods. Our most consistent finding was that the fundamental frequency or higher-frequency energy of a sound predicted pleasant affect (i.e., positive valence, happiness, sadness) in one direction and unpleasant affect (i.e., tension, anger, fear) in the opposite direction. Clear discrepancies among previous findings may be attributable to differences in experimental design: when pitch variation was present in a stimulus set, energy arousal was predicted by pitch and inharmonicity, whereas when attack variation was present, energy arousal was predicted by a faster attack and a shorter sustain.
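The analysis pipeline described in the abstract (descriptor extraction followed by linear and nonlinear prediction of affect ratings) can be illustrated with a minimal Python sketch. This is not the authors' implementation; it assumes librosa and scikit-learn, and the variables stimulus_paths and ratings are hypothetical placeholders for the stimulus audio files and their mean affect ratings.

```python
# Minimal sketch: extract a few spectral, temporal, and spectro-temporal
# descriptors per stimulus, then predict affect ratings with a linear model
# and a random forest. Illustrative only, not the authors' actual pipeline.
import numpy as np
import librosa
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

def describe(path):
    y, sr = librosa.load(path, sr=None, mono=True)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()  # spectral
    flatness = librosa.feature.spectral_flatness(y=y).mean()         # spectral
    mag = np.abs(librosa.stft(y))
    flux = np.sqrt((np.diff(mag, axis=1) ** 2).sum(axis=0)).mean()   # spectro-temporal
    rms = librosa.feature.rms(y=y)[0]                                # temporal envelope
    attack_time = np.argmax(rms) * 512 / sr                          # crude attack-time proxy
    return [centroid, flatness, flux, attack_time]

# Hypothetical inputs: paths to the stimulus recordings and their mean ratings
# (e.g., valence) on the same ordering.
X = np.array([describe(p) for p in stimulus_paths])
y = np.array(ratings)

# Linear vs. nonlinear prediction of the affect ratings.
print(cross_val_score(LinearRegression(), X, y, cv=5).mean())
print(cross_val_score(RandomForestRegressor(n_estimators=500), X, y, cv=5).mean())
```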
Funders:
Prins Bernhard Cultuurfonds
Canada Research Chair
Social Sciences and Humanities Research Council Partnership Grant
Canadian Social Sciences and Humanities Research Council Insight Grant