An End-to-End Musical Instrument System That Translates Electromyogram Biosignals to Synthesized Sound

Authors:

Atau Tanaka (1), Federico Visi (2), Balandino Di Donato (3), Martin Klang (4), Michael Zbyszyński (5)

Affiliation:

1. Department of Computing, Goldsmiths, University of London, London SE14 6NW, United Kingdom. a.tanaka@gold.ac.uk

2. Fakultät Gestaltung, Universität der Künste Berlin; School of Music in Piteå, Luleå University of Technology. mail@federicovisi.com

3. School of Computing, Engineering and the Built Environment, Edinburgh Napier University, 10 Colinton Road, Edinburgh EH10 5DT, United Kingdom. B.DiDonato@napier.ac.uk

4. Rebel Technology, C. Sant Isidre 38, 08912 Badalona, Spain. martin@rebeltech.org

5. L-Acoustics, 67 Southwood Lane, London, UK. z@mikezed.com

Abstract

This article presents a custom system combining hardware and software that senses physiological signals of the performer's body resulting from muscle contraction and translates them to computer-synthesized sound. Our goal was to build upon the history of research in the field to develop a complete, integrated system that could be used by nonspecialist musicians. We describe the Embodied AudioVisual Interaction Electromyogram, an end-to-end system spanning wearable sensing on the musician's body, custom microcontroller-based biosignal acquisition hardware, machine learning–based gesture-to-sound mapping middleware, and software-based granular synthesis sound output. A novel hardware design digitizes the electromyogram signals from the muscle with minimal analog preprocessing and treats them in an audio signal-processing chain as a class-compliant audio and wireless MIDI interface. The mapping layer implements an interactive machine learning workflow in a reinforcement learning configuration and can map gesture features to auditory metadata in a multidimensional information space. The system adapts existing machine learning and synthesis modules to work with the hardware, resulting in an integrated, end-to-end system. We explore its potential as a digital musical instrument through a series of public presentations and concert performances by a range of musical practitioners.
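To make the pipeline described above more concrete, the Python sketch below illustrates one way a windowed EMG signal could be reduced to gesture features and mapped to granular-synthesis parameters by a small regression model trained from performer demonstrations. It is an illustration only, not the published EAVI implementation: the window length, feature set, model choice, and parameter names (grain size, grain density) are assumptions.

# Illustrative sketch only (assumed names and parameters, not the authors' code):
# map windowed EMG features to granular-synthesis parameters with a small
# regression model trained from user demonstrations.
import numpy as np
from sklearn.neural_network import MLPRegressor

WINDOW = 256  # analysis window length in samples (assumption)

def emg_features(window: np.ndarray) -> np.ndarray:
    """Common time-domain EMG features: RMS amplitude and zero-crossing rate."""
    rms = np.sqrt(np.mean(window ** 2))
    zcr = np.count_nonzero(window[:-1] * window[1:] < 0) / len(window)
    return np.array([rms, zcr])

# Training by demonstration: the performer records example gestures and pairs
# each with desired synthesis parameters (hypothetical grain size and density, 0..1).
rng = np.random.default_rng(0)
demo_windows = [rng.standard_normal(WINDOW) * amp for amp in (0.1, 0.5, 1.0)]
X = np.vstack([emg_features(w) for w in demo_windows])
y = np.array([[0.1, 0.2], [0.5, 0.5], [0.9, 0.8]])  # target synthesis parameters

model = MLPRegressor(hidden_layer_sizes=(16,), solver="lbfgs",
                     max_iter=2000, random_state=0).fit(X, y)

# At performance time, each incoming EMG window is reduced to features and
# mapped to continuous parameter values that drive the granular synthesizer.
live_window = rng.standard_normal(WINDOW) * 0.7
grain_size, grain_density = model.predict(emg_features(live_window).reshape(1, -1))[0]
print(f"grain size: {grain_size:.2f}  grain density: {grain_density:.2f}")

In the system the article describes, this mapping stage is trained interactively, with the performer supplying and refining demonstration examples, and its output drives software granular synthesis rather than a print statement.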

Publisher

MIT Press
