Abstract
Objective
Closed-loop control of brain and behavior will benefit from real-time detection of behavioral events to enable low-latency communication with peripheral devices. In animal experiments, this is typically achieved with sparsely distributed (embedded) sensors that detect animal presence in select regions of interest. High-speed cameras provide high-density sampling across large arenas, capturing the richness of animal behavior; however, the image-processing bottleneck prohibits real-time feedback in the context of rapidly evolving behaviors.
Approach
Here we developed an open-source software, named PolyTouch, to track animal behavior in large arenas and provide rapid closed-loop feedback within ~5.7 ms, i.e. the average latency from the detection of an event to analog stimulus delivery (e.g. auditory tone, TTL pulse) when tracking a single body. This stand-alone software is written in Java. The included wrapper for MATLAB provides experimental flexibility for data acquisition, analysis and visualization.
Main results
As a proof-of-principle application, we deployed PolyTouch for place-awareness training. A user-defined portion of the arena served as a virtual target; a visit to (or approach toward) the target triggered auditory feedback. We show that mice develop awareness of virtual spaces: they stay for shorter periods and move faster while in the virtual target zone when their visits are coupled to relatively high stimulus intensity (≥49 dB). Thus, closed-loop presentation of perceived aversive feedback is sufficient to condition mice to avoid virtual targets within the span of a single session (~20 min).
Significance
Neuromodulation techniques now allow control of neural activity in a cell-type-specific manner at spiking resolution. Using animal behavior to drive closed-loop control of neural activity would help to address the neural basis of behavioral state- and environmental context-dependent information processing in the brain.
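For illustration only, the sketch below shows the basic closed-loop idea described in the Approach in plain Java: a tracked body position is tested against a user-defined virtual target zone and, on entry, an auditory tone is delivered through the standard Java sound API. This is not the PolyTouch implementation or its API; all class, method, and parameter names here are assumptions chosen for the example.

```java
// Minimal closed-loop sketch (illustrative, not the PolyTouch API):
// when a tracked position enters a user-defined virtual target zone,
// play a short auditory tone as feedback.
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioSystem;
import javax.sound.sampled.LineUnavailableException;
import javax.sound.sampled.SourceDataLine;
import java.awt.geom.Rectangle2D;

public class ClosedLoopSketch {

    // Virtual target zone in arena coordinates (e.g. pixels), user-defined.
    private final Rectangle2D targetZone;

    public ClosedLoopSketch(double x, double y, double w, double h) {
        this.targetZone = new Rectangle2D.Double(x, y, w, h);
    }

    // Called once per tracked frame with the animal's current position.
    // Returns true (and delivers feedback) if the position falls inside the zone.
    public boolean onPosition(double x, double y) {
        if (targetZone.contains(x, y)) {
            try {
                playTone(4000.0, 100);   // 4 kHz tone, 100 ms (illustrative values)
            } catch (LineUnavailableException e) {
                e.printStackTrace();
            }
            return true;
        }
        return false;
    }

    // Synthesize and play a sine tone via the standard Java sound API.
    private static void playTone(double freqHz, int durationMs)
            throws LineUnavailableException {
        float sampleRate = 44100f;
        AudioFormat format = new AudioFormat(sampleRate, 16, 1, true, false);
        byte[] buffer = new byte[(int) (sampleRate * durationMs / 1000) * 2];
        for (int i = 0; i < buffer.length / 2; i++) {
            short sample = (short) (Math.sin(2 * Math.PI * freqHz * i / sampleRate)
                                    * Short.MAX_VALUE * 0.5);
            buffer[2 * i] = (byte) (sample & 0xff);            // little-endian low byte
            buffer[2 * i + 1] = (byte) ((sample >> 8) & 0xff); // high byte
        }
        try (SourceDataLine line = AudioSystem.getSourceDataLine(format)) {
            line.open(format);
            line.start();
            line.write(buffer, 0, buffer.length);
            line.drain();
        }
    }

    public static void main(String[] args) {
        // Example: a 100x100 target zone anchored at (200, 200).
        ClosedLoopSketch loop = new ClosedLoopSketch(200, 200, 100, 100);
        loop.onPosition(250, 250); // inside the zone: triggers the tone
        loop.onPosition(10, 10);   // outside: no feedback
    }
}
```

In a real setup the position callback would be driven by the tracker at frame rate, and the feedback channel could equally be a TTL pulse on a digital output line rather than a software-generated tone.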
Publisher
Cold Spring Harbor Laboratory