Authors
Agres, Kat R.; Dash, Adyasha; Chua, Phoebe
Abstract
This work introduces a new music generation system, called AffectMachine-Classical, that is capable of generating affective Classical music in real time. AffectMachine was designed to be incorporated into biofeedback systems (such as brain-computer interfaces) to help users become aware of, and ultimately mediate, their own dynamic affective states. That is, the system was developed for music-based MedTech to support real-time emotion self-regulation in users. We provide an overview of the rule-based, probabilistic system architecture, describing the main aspects of the system and how they are novel. We then present the results of a listener study conducted to validate the system's ability to reliably convey target emotions to listeners. The findings indicate that AffectMachine-Classical is very effective at communicating different levels of Arousal (R² = 0.96) to listeners, and is also quite convincing with respect to Valence (R² = 0.90). Future work will embed AffectMachine-Classical into biofeedback systems, to leverage the efficacy of the affective music for emotional wellbeing in listeners.
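The abstract describes a rule-based, probabilistic mapping from target emotions to music, but does not spell out the rules. As a rough illustration of what such a mapping can look like, here is a minimal Python sketch; the parameter names, ranges, and rules below are assumptions made for illustration only and are not taken from AffectMachine-Classical itself.

```python
import random

def affect_to_music_params(valence, arousal, seed=None):
    """Toy rule-based, probabilistic mapping from a target (valence, arousal)
    pair, each in [0, 1], to coarse musical parameters. Illustrative only."""
    rng = random.Random(seed)

    # Assumed rule: higher arousal -> faster tempo, with small stochastic jitter.
    tempo_bpm = 60 + 80 * arousal + rng.gauss(0, 3)

    # Assumed rule: higher valence -> greater probability of the major mode.
    mode = "major" if rng.random() < 0.2 + 0.6 * valence else "minor"

    # Assumed rule: higher arousal -> louder dynamics and denser note onsets.
    dynamics = ["pp", "p", "mf", "f", "ff"][min(4, int(arousal * 5))]
    notes_per_beat = 1 + round(3 * arousal)

    return {
        "tempo_bpm": round(tempo_bpm, 1),
        "mode": mode,
        "dynamics": dynamics,
        "notes_per_beat": notes_per_beat,
    }

if __name__ == "__main__":
    # Example: a high-arousal, positive-valence target (e.g., excited/happy).
    print(affect_to_music_params(valence=0.8, arousal=0.9, seed=42))
```

In a real-time biofeedback setting, a mapping of this kind would be re-evaluated continuously as the estimated affective state changes, with the stochastic components providing musical variety; the actual rules and parameters used by the system are described in the paper itself.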
Funder
Agency for Science, Technology and Research
Cited by
1 article.