Author:
Vera Vasas, Mark C. Lowell, Juliana Villa, Quentin D. Jamison, Anna G. Siegle, Pavan Kumar Reddy Katta, Pushyami Bhagavathula, Peter G. Kevan, Drew Fulton, Neil Losin, David Kepplinger, Shakiba Salehian, Rebecca E. Forkner, Daniel Hanley
Abstract
Plants, animals, and fungi display a rich tapestry of colors. Animals, in particular, use colors in dynamic displays performed in spatially complex environments. In such natural settings, light is reflected or refracted from objects with complex shapes that cast shadows and generate highlights. In addition, the illuminating light changes continuously as viewers and targets move through heterogeneous, continually fluctuating, light conditions. Although traditional spectrophotometric approaches for studying colors are objective and repeatable, they fail to document this complexity. Worse, they miss the temporal variation of color signals entirely. Here, we introduce hardware and software that provide ecologists and filmmakers the ability to accurately record animal-perceived colors in motion. Specifically, our Python code transforms photos or videos into perceivable units (quantum catches) for any animal of known photoreceptor sensitivity. We provide the plans, code, and validation tests necessary for end-users to capture animal-view videos. This approach will allow ecologists to investigate how animals use colors in dynamic behavioral displays, the ways natural illumination alters perceived colors, and other questions that remained unaddressed until now due to a lack of suitable tools. Finally, our pipeline provides scientists and filmmakers with a new, empirically grounded approach for depicting the perceptual worlds of non-human animals.
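The quantum catch mentioned in the abstract is, in essence, the photon flux a photoreceptor class absorbs from a stimulus: Q_i = ∫ I(λ) R(λ) S_i(λ) dλ, where I is the illuminant spectrum, R the surface reflectance, and S_i the spectral sensitivity of receptor class i. The Python sketch below illustrates that per-spectrum calculation under simplifying assumptions (synthetic spectra, uniform 1 nm sampling, toy Gaussian receptor curves); it is not the authors' released pipeline, which applies an equivalent transform per pixel of calibrated photos or video frames.

import numpy as np

# Minimal illustrative sketch of a quantum-catch calculation. All spectra
# and receptor names below are synthetic placeholders, not the authors'
# published calibration data.
wavelengths = np.arange(300, 701, 1.0)  # nm; UV through visible, 1 nm steps

def quantum_catch(illuminant, reflectance, sensitivity, step=1.0):
    """Approximate Q_i = integral of I(l) * R(l) * S_i(l) dl (Riemann sum)."""
    return float(np.sum(illuminant * reflectance * sensitivity) * step)

def gaussian(peak, width=40.0):
    """Toy receptor sensitivity curve centered on `peak` (nm)."""
    return np.exp(-0.5 * ((wavelengths - peak) / width) ** 2)

illuminant = np.ones_like(wavelengths)  # flat stand-in for daylight
reflectance = np.clip((wavelengths - 500.0) / 200.0, 0.0, 1.0)  # long-wave-reflecting surface

# Illustrative trichromatic, honeybee-like receptor set (UV/blue/green peaks).
receptors = {"uv": gaussian(350), "blue": gaussian(440), "green": gaussian(540)}
catches = {name: quantum_catch(illuminant, reflectance, s)
           for name, s in receptors.items()}

# Relative catches (normalized to sum to 1), a common form for placing
# stimuli in animal color spaces.
total = sum(catches.values())
print({name: round(q / total, 3) for name, q in catches.items()})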
Publisher
Cold Spring Harbor Laboratory
Cited by
1 article.