Affiliation:
1. Wild Minds Lab, School of Psychology and Neuroscience, University of St Andrews, St Andrews, UK
2. Department of Pedagogy, Chubu Gakuin University, Gifu, Japan
3. Division of the Humanities and Social Sciences, California Institute of Technology, Pasadena, California, USA
Abstract
Studying animal behaviour allows us to understand how different species and individuals navigate their physical and social worlds. Video coding of behaviour is considered a gold standard: it allows researchers to extract rich, nuanced behavioural datasets, to validate their reliability, and to replicate research. However, in practice, videos are only useful if data can be extracted efficiently. Manually locating relevant footage in tens of thousands of hours of video is extremely time-consuming, as is the manual coding of animal behaviour, which requires extensive training to achieve reliability.
Machine learning approaches are used to automate the recognition of patterns within data, considerably reducing the time taken to extract data and improving reliability. However, tracking visual information to recognise nuanced behaviour is a challenging problem, and, to date, the tracking and pose-estimation tools used to detect behaviour have typically been applied in settings where the visual environment is highly controlled.
Animal behaviour researchers are interested in applying these tools to the study of wild animals, but it is not clear to what extent doing so is currently possible, or which tools are best suited to particular problems. To address this gap in knowledge, we describe the new tools available in this rapidly evolving landscape, offer guidance for tool selection, provide a worked demonstration of the use of machine learning to track movement in video data of wild apes, and make our base models available for use.
We use a pose-estimation tool, DeepLabCut, to demonstrate successful training of two pilot models for an extremely challenging pose-estimation and tracking problem: multi-animal tracking of wild, forest-living chimpanzees and bonobos across behavioural contexts, from hand-held video footage.
With DeepWild we show that, without requiring specific expertise in machine learning, pose estimation and movement tracking of free‐living wild primates in visually complex environments is an attainable goal for behavioural researchers.
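The abstract references the standard DeepLabCut workflow for applying a trained network to new footage. Below is a minimal sketch of how a released base model, such as those made available here, might be run on hand-held video; the config and video paths are hypothetical placeholders, and the DeepLabCut documentation should be consulted for the full API.

import deeplabcut

# Hypothetical paths: point these at a DeepLabCut project config
# (e.g. one built around the released DeepWild base models) and at
# your own video footage.
config_path = "/path/to/DeepWild-project/config.yaml"
videos = ["/path/to/wild_chimp_footage.mp4"]

# Run pose estimation on the footage with the trained network;
# results are saved alongside the video files.
deeplabcut.analyze_videos(config_path, videos)

# Overlay the estimated body-part positions on the video so the
# quality of the tracking can be inspected visually.
deeplabcut.create_labeled_video(config_path, videos)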
Funder
Horizon 2020 Framework Programme
Subject
Animal Science and Zoology; Ecology, Evolution, Behavior and Systematics
Cited by
15 articles.