Affiliation:
1. Department of Computer Science, University of Vermont, Burlington, Vermont 05405, USA
2. Vermont Complex Systems Center, University of Vermont, Burlington, Vermont 05405, USA
Abstract
There has recently been an explosion of interest in how “higher-order” structures emerge in complex systems composed of many interacting elements (often called “synergistic” information). This “emergent” organization has been found in a variety of natural and artificial systems, although at present the field lacks a unified understanding of what the consequences of higher-order synergies and redundancies are for the systems under study. Typical research treats the presence (or absence) of synergistic information as a dependent variable and reports changes in the level of synergy in response to some change in the system. Here, we attempt to flip the script: rather than treating higher-order information as a dependent variable, we use evolutionary optimization to evolve Boolean networks with significant higher-order redundancies, synergies, or statistical complexity. We then analyze these evolved populations of networks using established tools for characterizing discrete dynamics: the number of attractors, the average transient length, and the Derrida coefficient. We also assess the capacity of the systems to integrate information. We find that high-synergy systems are unstable and chaotic, but have a high capacity to integrate information. In contrast, evolved redundant systems are extremely stable but have negligible capacity to integrate information. Finally, the complex systems that balance integration and segregation (known as Tononi–Sporns–Edelman complexity) show features of both chaoticity and stability, with a greater capacity to integrate information than the redundant systems while being more stable than the random and synergistic systems. We conclude that there may be a fundamental trade-off between the robustness of a system’s dynamics and its capacity to integrate information (which inherently requires flexibility and sensitivity), and that certain kinds of complexity naturally balance this trade-off.
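To make the dynamical measures named in the abstract concrete, the sketch below builds a small random Boolean network and computes the number of attractors, the average transient length, and a one-step estimate of the Derrida coefficient. This is a minimal illustrative sketch, not the authors' code: the network size, connectivity, random lookup tables, and the exhaustive state-space sweep are all assumptions chosen only to keep the example small and runnable.

```python
# Minimal sketch (assumed toy parameters, not the paper's implementation):
# a random Boolean network with N nodes and K inputs per node, plus the three
# dynamical summaries mentioned in the abstract.
import itertools
import random

N, K = 8, 2  # assumed: 8 nodes, 2 inputs per node

random.seed(0)
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [{bits: random.randint(0, 1) for bits in itertools.product((0, 1), repeat=K)}
          for _ in range(N)]

def step(state):
    """Synchronously update every node from its lookup table."""
    return tuple(tables[i][tuple(state[j] for j in inputs[i])] for i in range(N))

def attractors_and_transients():
    """Exhaustively sweep all 2^N states; return attractor count and mean transient length."""
    attractors, transients = set(), []
    for start in itertools.product((0, 1), repeat=N):
        seen, state = {}, start
        while state not in seen:
            seen[state] = len(seen)
            state = step(state)
        transients.append(seen[state])      # steps taken before entering the cycle
        cycle, s = [], state
        while True:
            cycle.append(s)
            s = step(s)
            if s == state:
                break
        attractors.add(min(cycle))          # canonical representative of the cycle
    return len(attractors), sum(transients) / len(transients)

def derrida_coefficient(samples=2000):
    """Mean Hamming distance after one step, starting from states one bit apart."""
    total = 0
    for _ in range(samples):
        a = tuple(random.randint(0, 1) for _ in range(N))
        flip = random.randrange(N)
        b = tuple(bit ^ 1 if i == flip else bit for i, bit in enumerate(a))
        total += sum(x != y for x, y in zip(step(a), step(b)))
    return total / samples                  # >1 suggests chaotic, <1 ordered dynamics

n_attr, mean_transient = attractors_and_transients()
print(f"attractors: {n_attr}, mean transient: {mean_transient:.2f}, "
      f"Derrida coefficient: {derrida_coefficient():.2f}")
```

In this sketch, stable (redundant-like) networks would show few attractors, short transients, and a Derrida coefficient below one, while chaotic (synergy-like) networks would show the opposite pattern; the paper's evolved populations are characterized with these same kinds of summaries.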