Abstract
Predicting the emotional responses of humans to soundscapes is a relatively recent field of research that comes with a wide range of promising applications. This work presents the design of two convolutional neural networks, ArNet and ValNet, each responsible for quantifying the arousal or valence evoked by a soundscape. We build on the knowledge acquired from applying traditional machine learning techniques to this domain and design a suitable deep learning framework. Moreover, we propose the use of artificially created mixed soundscapes, whose distributions lie between those of the available samples, a process that increases the variance of the dataset and leads to significantly better performance. The reported results outperform the state of the art on a soundscape dataset following Schafer's standardized categorization, which considers both a sound's identity and the respective listening context.
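The mixing step described in the abstract can be pictured as a mixup-style interpolation between pairs of training soundscapes and their affect labels. The following is a minimal sketch under that assumption; the function name mix_soundscapes, the waveform-level linear blend, and the Beta-sampled mixing weight are illustrative choices, not the authors' published procedure.

```python
import numpy as np

def mix_soundscapes(x1, y1, x2, y2, alpha=0.5, rng=None):
    """Blend two soundscape clips and interpolate their (arousal, valence) labels.

    x1, x2 : 1-D numpy arrays of equal length (audio waveforms)
    y1, y2 : 2-element sequences (arousal, valence) on a common scale
    alpha  : Beta-distribution parameter controlling the mixing weight
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)            # mixing weight in (0, 1)
    x_mix = lam * x1 + (1.0 - lam) * x2     # linear blend of the two waveforms
    y_mix = lam * np.asarray(y1, float) + (1.0 - lam) * np.asarray(y2, float)
    return x_mix, y_mix

# Hypothetical usage: create a synthetic sample lying "between" two real ones,
# then add (x_new, y_new) to the training set for ArNet/ValNet-style models.
# x_new, y_new = mix_soundscapes(x_a, (0.7, -0.2), x_b, (0.1, 0.5))
```

Such synthetic samples populate regions of the arousal–valence space between the annotated recordings, which is one plausible way the reported increase in dataset variance could be realized.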
Publisher
Springer Science and Business Media LLC
Subject
Computer Networks and Communications, Hardware and Architecture, Media Technology, Software
Cited by
7 articles.