Author:
Cruz Tomás, Fujiwara Terufumi, Varela Nélia, Mohammad Farhan, Claridge-Chang Adam, Chiappe M Eugenia
Abstract
Course control is critical for the acquisition of spatial information during exploration and navigation, and it is thought to rely on neural circuits that process locomotion-related multimodal signals. However, which circuits underlie this control, and how multimodal information contributes to the control system, remain poorly understood. We used Virtual Reality to examine the role of self-generated visual signals (visual feedback) in the control of exploratory walking in flies. Exploratory flies display two distinct motor contexts, characterized by low speed and fast rotations, or by high speed and slow rotations, respectively. Flies use visual feedback to control body rotations, but in a motor-context-specific manner, primarily when walking at high speed. Different populations of visual motion-sensitive cells estimate body rotations via congruent, multimodal inputs and drive compensatory rotations. However, their effective contribution to course control is dynamically tuned by a speed-related signal. Our data identify visual networks with a multimodal circuit mechanism for adaptive course control and suggest models for how visual feedback is combined with internal signals to guide exploratory course control.
Publisher
Cold Spring Harbor Laboratory
Cited by
8 articles.