Abstract
Many predatory animals rely on accurate sensory perception, predictive models, and precise pursuits to catch moving prey. Larval zebrafish intercept paramecia during their hunting behavior, but the precise trajectories of their prey have never been recorded in relation to fish movements in three dimensions.

As a means of uncovering what a simple organism understands about its physical world, we have constructed a 3D-imaging setup to simultaneously record the behavior of larval zebrafish, as well as their moving prey, during hunting. We show that zebrafish robustly transform their 3D displacement and rotation according to the position of their prey while modulating both of these variables depending on prey velocity. This holds for both azimuth and altitude, although the particulars of the hunting algorithm differ slightly between the two planes to accommodate an asymmetric strike zone. We show that the combination of position and velocity perception provides the fish with an estimate of the prey's future position, indicating an ability to project trajectories forward in time. Using computational models, we show that this projection ability is critical for prey-capture efficiency and success. Further, we demonstrate that the fish use a graded stochasticity algorithm in which the variance around the mean result of each swim scales with distance from the target. Notably, this strategy provides the animal with a considerable improvement over equivalent noise-free strategies.

In sum, our quantitative and probabilistic modeling shows that zebrafish are equipped with a stochastic recursive algorithm that embodies an implicit predictive model of the world. This algorithm, built from a simple set of behavioral rules, allows the fish to optimize their hunting strategy in a naturalistic three-dimensional environment.
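The two core ingredients described above, aiming at a velocity-extrapolated future prey position and motor variability that grows with distance to the target, can be sketched as a toy model. All function names, parameter values, and the linear noise scaling below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def plan_swim(prey_pos, prey_vel, fish_pos,
              lead_time=0.2, noise_gain=0.1, rng=None):
    """Toy sketch of a predictive, graded-stochasticity swim command.

    prey_pos, prey_vel, fish_pos: 3D numpy arrays.
    lead_time and noise_gain are hypothetical parameters.
    Returns a 3D displacement vector for one swim bout.
    """
    rng = rng if rng is not None else np.random.default_rng()
    # Implicit predictive model: aim at the prey's estimated
    # future position, extrapolated linearly from its velocity.
    aim = prey_pos + lead_time * prey_vel
    intended = aim - fish_pos
    distance = np.linalg.norm(intended)
    # Graded stochasticity: the spread of the motor output
    # scales with the current distance to the target.
    return intended + rng.normal(scale=noise_gain * distance, size=3)
```

With `noise_gain=0` the model reduces to a deterministic, noise-free interception rule, which is the baseline the stochastic strategy is compared against in the abstract.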
Publisher
Cold Spring Harbor Laboratory