Authors:
Miguel Civit, Luis Muñoz-Saavedra, Francisco Cuadrado, Charles Tijus, María José Escalona
Abstract
In this paper, we present a novel framework for the study and design of AI-assisted musical devices (AIMEs). We first present a taxonomy of these devices and illustrate it with a set of scenarios and personas. We then propose a generic architecture for the implementation of AIMEs and present examples drawn from the scenarios. We show that the proposed framework and architecture are a valid tool for the study of intelligent musical devices.