Abstract
Cross-modal temporal recalibration is crucial for maintaining coherent perception in a multimodal environment. The classic view suggests that cross-modal temporal recalibration aligns the perceived timing of sensory signals from different modalities, such as sound and light, to compensate for physical and neural latency differences. However, this view cannot fully explain the nonlinearity and asymmetry observed in audiovisual recalibration effects: the amount of recalibration plateaus with increasing audiovisual asynchrony and varies depending on the leading modality of the asynchrony during exposure. To address these discrepancies, our study examines the mechanism of audiovisual temporal recalibration through the lens of causal inference, considering the brain's capacity to determine whether multimodal signals come from a common source and should be integrated, or else kept separate. In a three-phase recalibration paradigm, we manipulated the adapter stimulus-onset asynchrony in the exposure phase across nine sessions, introducing asynchronies of up to 0.7 s of either auditory or visual lead. Before and after the exposure phase in each session, we measured participants' perception of audiovisual relative timing using a temporal-order-judgment task. We compared models that assumed observers recalibrate to approach either physical synchrony or the causal-inference-based percept, with uncertainties specific to each modality or comparable across them. Modeling results revealed that a causal-inference model incorporating modality-specific uncertainty captures both the nonlinearity and asymmetry of audiovisual temporal recalibration. Our results indicate that human observers employ causal-inference-based percepts to recalibrate cross-modal temporal perception.
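The causal-inference mechanism summarized in the abstract can be illustrated numerically. The sketch below is a minimal, generic Bayesian causal-inference model of asynchrony perception, not the authors' fitted model: all parameter values (`sigma`, `sigma_common`, `sigma_indep`, `prior_common`) and the function names are assumptions for illustration. The observer weighs a common-cause interpretation (true asynchrony near zero) against an independent-causes interpretation, and recalibration is driven by the gap between the posterior-weighted percept and the measured asynchrony rather than by physical synchrony.

```python
import math

def _normpdf(x, sd):
    """Gaussian density; likelihood of a measured asynchrony."""
    return math.exp(-x * x / (2 * sd * sd)) / (sd * math.sqrt(2 * math.pi))

def recalibration_shift(m, sigma=0.08, sigma_common=0.05,
                        sigma_indep=0.5, prior_common=0.5):
    """Shift of perceived timing after exposure to measured asynchrony m (s).

    All parameters are illustrative assumptions:
      sigma        -- sensory noise of the asynchrony measurement
      sigma_common -- prior width on asynchrony given a common cause
      sigma_indep  -- prior width given independent causes
      prior_common -- prior probability of a common cause
    """
    # Posterior probability that sound and light share a common cause
    like_c1 = _normpdf(m, math.hypot(sigma, sigma_common))
    like_c2 = _normpdf(m, math.hypot(sigma, sigma_indep))
    post_c1 = (prior_common * like_c1) / (
        prior_common * like_c1 + (1 - prior_common) * like_c2)

    # Optimal asynchrony estimate under each causal structure
    # (shrinkage of the measurement toward the prior mean of zero)
    est_c1 = m * sigma_common**2 / (sigma_common**2 + sigma**2)
    est_c2 = m * sigma_indep**2 / (sigma_indep**2 + sigma**2)

    # Model-averaged percept; recalibration moves perception toward it
    percept = post_c1 * est_c1 + (1 - post_c1) * est_c2
    return percept - m
```

In this sketch the shift is largest at intermediate asynchronies and shrinks as large asynchronies are increasingly attributed to independent causes, reproducing the plateau described in the abstract; letting the sensory noise differ between auditory-lead and visual-lead exposure (modality-specific uncertainty) is what would produce the reported asymmetry.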
Publisher
Cold Spring Harbor Laboratory