Abstract
The human brain comprises an intricate web of connections that generate complex neural networks capable of storing and processing information. This information depends on multiple factors, including the underlying network structure, connectivity, and interactions; thus, methods to characterize neural networks typically aim to unravel and interpret a combination of these factors. Here, we present four-dimensional (4D) Shannon’s entropy, a novel quantitative metric of network activity based on the Triple Correlation Uniqueness (TCU) theorem. Triple correlation, which provides a complete and unique characterization of the network, relates three nodes separated by up to four spatiotemporal lags (two spatial and two temporal). We evaluate the 4D entropy from the spatiotemporal lag probability distribution function (PDF) of the network activity’s triple correlation. Given a spike raster, we compute the triple correlation by iterating over time and space. Summing the contributions to the triple correlation over each combination of spatial and temporal lags generates a unique 4D spatiotemporal lag distribution, from which we estimate a PDF and compute Shannon’s entropy. To outline our approach, we first compute 4D Shannon’s entropy from feedforward motif-class patterns in a simulated spike raster. We then apply this methodology to spiking activity recorded from rat cortical cultures and compare our results to previously published results of pairwise (2D) correlated spectral entropy over time. We find that while first- and second-order metrics of activity (spike rate and cross-correlation) agree with previously published results, our 4D entropy computation (which also includes third-order interactions) reveals a greater depth of underlying network organization than the published pairwise entropy.
Ultimately, because our approach is based on the TCU theorem, we propose that 4D Shannon’s entropy is a more complete tool for neural network characterization.

Author Summary
Here, we present a novel entropy metric for neural network characterization, 4D Shannon’s entropy, based on triple correlation, which measures interactions among up to three neurons in time and space. Per the Triple Correlation Uniqueness (TCU) theorem, our 4D entropy approach rests on a complete and unique characterization of network activity. We first outline the method for obtaining 4D Shannon’s entropy using a simulated spike raster of feedforward three-neuron configurations. We then apply this metric to an open-source experimental dataset of rat cortical cultures recorded over time and show that, while first- and second-order interactions (spike rate and cross-correlation) follow trends similar to published results, the TCU-based 4D Shannon’s entropy metric provides greater insight into later-stage network activity than the published pairwise entropy. Because this metric is computed from a 4D distribution unique to the network, we propose that 4D entropy offers a clear advantage over currently used pairwise entropy metrics for neural network analyses. For this reason, neuroscientific and clinical applications abound; these may include analysis of distinct dynamical states, characterization of responses to medication, and identification of pathological brain networks, such as seizures.
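The pipeline described above (accumulate the triple correlation of a spike raster over all combinations of two spatial and two temporal lags into a 4D distribution, normalize it to a PDF, and compute Shannon’s entropy) can be sketched in a few lines. This is only an illustrative sketch, not the authors’ implementation: the lag window, the circular boundary handling via `np.roll`, and the toy random raster are all assumptions made for demonstration.

```python
import numpy as np

def triple_correlation_4d(raster, max_spatial_lag, max_temporal_lag):
    """Accumulate the triple correlation of a binary spike raster over all
    combinations of two spatial lags (s1, s2) and two temporal lags (t1, t2),
    producing a 4D spatiotemporal lag distribution.

    Boundary handling (circular wrapping) and lag ranges are illustrative
    assumptions, not the published implementation.
    """
    lags_s = range(-max_spatial_lag, max_spatial_lag + 1)
    lags_t = range(-max_temporal_lag, max_temporal_lag + 1)
    dim_s = 2 * max_spatial_lag + 1
    dim_t = 2 * max_temporal_lag + 1
    tc = np.zeros((dim_s, dim_t, dim_s, dim_t))
    for i, s1 in enumerate(lags_s):
        for j, t1 in enumerate(lags_t):
            # raster shifted by the first (spatial, temporal) lag pair
            r1 = np.roll(raster, (-s1, -t1), axis=(0, 1))
            for k, s2 in enumerate(lags_s):
                for l, t2 in enumerate(lags_t):
                    # raster shifted by the second lag pair; the product of the
                    # three arrays counts co-occurring spike triplets
                    r2 = np.roll(raster, (-s2, -t2), axis=(0, 1))
                    tc[i, j, k, l] = np.sum(raster * r1 * r2)
    return tc

def entropy_4d(tc):
    """Shannon entropy (bits) of the PDF estimated by normalizing the
    4D spatiotemporal lag distribution."""
    p = tc / tc.sum()
    p = p[p > 0]  # 0 * log2(0) is taken as 0
    return -np.sum(p * np.log2(p))

# Toy example: a small random binary raster (10 neurons x 100 time bins)
rng = np.random.default_rng(0)
raster = (rng.random((10, 100)) < 0.1).astype(float)
tc = triple_correlation_4d(raster, max_spatial_lag=2, max_temporal_lag=2)
print(entropy_4d(tc))
```

With binary spikes, the all-zero-lag bin of `tc` simply counts spikes (since for a 0/1 raster the triple product collapses to the raster itself), which is a convenient sanity check on an implementation.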
Publisher
Cold Spring Harbor Laboratory