Abstract
Brain and behavioural asymmetries have been documented in various taxa. Many of these asymmetries involve preferential left- and right-eye use. However, measuring eye use through manual frame-by-frame analysis of video recordings is laborious and may introduce biases. Recent technological progress has enabled the development of accurate tracking techniques for measuring animal behaviour. Among these techniques, DeepLabCut, a Python-based tracking toolbox using transfer learning with deep neural networks, makes it possible to track different body parts with unprecedented accuracy. Exploiting the potential of DeepLabCut, we developed ‘Visual Field Analysis’, an additional open-source application for extracting eye-use data. To our knowledge, this is the first application that can automatically quantify left–right preferences in eye use. Here we test the performance of our application in measuring preferential eye use in young domestic chicks. Comparison with manual scoring methods revealed a perfect correlation with the eye-use measures obtained by ‘Visual Field Analysis’. With our application, eye use can be analysed reliably, objectively and at a fine scale in different experimental paradigms.
Publisher
Cold Spring Harbor Laboratory
Cited by
1 article.