Abstract
In previous work, we described the geometry of Bayesian learning on a manifold. In this paper, inspired by a modified form of sociologist Niklas Luhmann's notion of the double contingency of communications, we take two manifolds on an equal footing, together with a potential function on their product, to set up mutual Bayesian learning between them. In particular, given a parametric statistical model, we consider mutual learning between two copies of the parameter space, where we take the relative entropy (i.e., the Kullback–Leibler divergence) as the potential. Although this mutual learning retains nothing of the model except the relative entropy, it can still substitute for ordinary Bayesian estimation of the parameter in certain cases. We propose this construction as a globalization of information geometry.
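As a rough illustration of the setup, the following is a minimal numerical sketch of mutual learning between two copies of a discretized parameter space, with the potential given by the relative entropy. The concrete update rule (each side reweights its density by the exp(−V)-average against the other side's current density, then renormalizes) and the Gaussian location model are illustrative assumptions for this sketch, not the paper's exact scheme.

```python
# Minimal sketch: mutual Bayesian learning between two copies of a
# parameter space, with the potential V given by the relative entropy.
# ASSUMPTION: the simultaneous reweighting update below is an
# illustrative choice, not necessarily the paper's exact dynamics.
import numpy as np

# Model: N(theta, 1) with known variance, so the relative entropy
# between p(.|a) and p(.|b) has the closed form (a - b)^2 / 2.
theta = np.linspace(-5.0, 5.0, 401)              # discretized parameter space
dtheta = theta[1] - theta[0]
V = 0.5 * (theta[:, None] - theta[None, :]) ** 2  # potential V(a, b) = KL(a || b)

# The two copies start from different prior beliefs (densities on the grid).
q1 = np.exp(-0.5 * (theta - 2.0) ** 2); q1 /= q1.sum() * dtheta
q2 = np.exp(-0.5 * (theta + 2.0) ** 2); q2 /= q2.sum() * dtheta

for _ in range(50):
    # Each side treats exp(-V) as a likelihood, averages it against the
    # other side's current belief, reweights, and renormalizes.
    l1 = np.exp(-V) @ (q2 * dtheta)
    l2 = np.exp(-V.T) @ (q1 * dtheta)
    q1 = q1 * l1; q1 /= q1.sum() * dtheta
    q2 = q2 * l2; q2 /= q2.sum() * dtheta

# Under this rule the two beliefs contract toward a common consensus.
print("posterior means:", (theta * q1).sum() * dtheta, (theta * q2).sum() * dtheta)
```

In this toy run the two densities, initially centered at +2 and −2, pull each other toward a shared estimate near 0; only the relative entropy enters the dynamics, mirroring the abstract's point that the mutual learning forgets everything about the model except the potential.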
Subject
Physics and Astronomy (miscellaneous), General Mathematics, Chemistry (miscellaneous), Computer Science (miscellaneous)
References
1. Mori, A. (2020). Global Geometry of Bayesian Statistics. Entropy, 22.
2. Parsons, T. (1951). The Social System, Free Press.
3. Geyer, R.F., and van der Zouwen, J. (1986). Sociocybernetic Paradoxes: Observation, Control and Evolution of Self-Steering Systems, Sage.
4. Amari, S. (2016). Information Geometry and Its Applications, Springer.
5. Mori, A. (2018). Information geometry in a global setting. Hiroshima Math. J.