Authors:
Simone Dalla Bella, Stefan Janaqi, Charles-Etienne Benoit, Nicolas Farrugia, Valentin Bégel, Laura Verga, Eleanor E. Harding, Sonja A. Kotz
Abstract
Humans can easily extract the rhythm of a complex sound, like music, and move to its regular beat, for example in dance. These abilities are modulated by musical training and vary significantly in untrained individuals. The causes of this variability are multidimensional and typically hard to grasp with single tasks. To date, we lack a comprehensive model capturing the rhythmic fingerprints of both musicians and non-musicians. Here we harnessed machine learning to extract a parsimonious model of rhythmic abilities, based on the behavioral testing (with perceptual and motor tasks) of individuals with and without formal musical training (n = 79). We demonstrate that the variability of rhythmic abilities, and their link with formal and informal music experience, can be successfully captured by profiles including a minimal set of behavioral measures. These profiles can shed light on individual variability in healthy and clinical populations, and provide guidelines for personalizing rhythm-based interventions.
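The abstract does not detail the machine-learning pipeline, so the following is only a minimal, purely illustrative sketch of the general idea of reducing a battery of rhythm measures to a small, interpretable set and grouping participants into profiles. It is not the authors' method; the library choices (scikit-learn's SparsePCA and KMeans), the number of measures, components, and clusters, and the synthetic data are all assumptions for illustration.

```python
# Illustrative sketch only (not the authors' pipeline): sparse dimensionality
# reduction over hypothetical behavioral measures, followed by clustering
# participants into rhythmic "profiles".
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import SparsePCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical data: 79 participants x 12 behavioral measures
# (e.g., tapping variability, beat-alignment accuracy, duration discrimination).
X = rng.normal(size=(79, 12))

# Standardize so measures with different units are comparable.
Xz = StandardScaler().fit_transform(X)

# Sparse components keep only a few non-zero loadings, which is one way to
# obtain a "minimal set" of measures per component.
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0)
scores = spca.fit_transform(Xz)

# Inspect which measures each component retains (non-zero loadings).
for i, comp in enumerate(spca.components_):
    kept = np.flatnonzero(comp)
    print(f"component {i}: measures {kept.tolist()}")

# Group participants into profiles from their component scores.
profiles = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print("profile sizes:", np.bincount(profiles))
```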
Publisher
Cold Spring Harbor Laboratory