Abstract
Identifying parameters of computational models that capture experimental data is a central task in cognitive neuroscience. Bayesian statistical inference aims to not only identify a single configuration of best-fitting parameters, but to recover all model parameters that are consistent with the data and prior knowledge. Statistical inference methods usually require the ability to evaluate the likelihood of the model—however, for many models of interest in cognitive neuroscience, the associated likelihoods cannot be computed efficiently. Simulation-based inference (SBI) offers a solution to this problem by only requiring access to simulations produced by the model. Here, we provide an efficient SBI method for models of decision-making. Our approach, Mixed Neural Likelihood Estimation (MNLE), trains neural density estimators on model simulations to emulate the simulator. The likelihoods of the emulator can then be used to perform Bayesian parameter inference on experimental data using standard approximate inference methods like Markov Chain Monte Carlo sampling. While most neural likelihood estimation methods target continuous data, MNLE works with mixed data types, as typically obtained in decision-making experiments (e.g., binary decisions and associated continuous reaction times). We demonstrate MNLE on two variants of the drift-diffusion model (DDM) and compare its performance to a recently proposed method for SBI on DDMs, called likelihood approximation networks (LAN, Fengler et al. 2021). We show that MNLE is substantially more efficient than LANs, requiring six orders of magnitude fewer model simulations to achieve comparable likelihood accuracy and evaluation time while providing the same level of flexibility. We include an implementation of our algorithm in the user-friendly open-source package sbi.
Publisher
Cold Spring Harbor Laboratory
References (71 articles).
1. Accelerating Bayesian synthetic likelihood with the graphical lasso. Journal of Computational and Graphical Statistics, 2019.
2. Anonymous. Variational methods for simulation-based inference. Submitted to the Tenth International Conference on Learning Representations (under review), 2022. URL https://openreview.net/forum?id=kZ0UYdhqkNY.
3. Pyro: Deep universal probabilistic programming. Journal of Machine Learning Research, 2019.
4. ChaRTr: An R toolbox for modeling choices and response times in decision-making tasks.