Abstract
The complexity of a general system is identified with its temperature and, by analogy with Boltzmann's probability density in thermodynamics, this temperature is related to the informational entropy of the system. The concept of informational entropy of deterministic functions provides a straightforward model of Brillouin's negentropy (negative entropy), so that a system can be characterized by both its complexity and its dual complexity. Composition laws for complexities are stated in terms of Shannonian entropy, with or without probability, and the approach is then extended to the quantum entropy of non-probabilistic data. Some suggestions for future investigation are outlined.
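For orientation, the standard definitions the abstract alludes to can be sketched as follows (conventional notation; the symbols H, p_i, n, and N are generic assumptions, not necessarily the paper's own):

  H(X) = -\sum_{i=1}^{n} p_i \log p_i            % Shannon entropy of a discrete distribution
  N(X) = H_{\max} - H(X) = \log n - H(X) \geq 0  % Brillouin's negentropy, the deficit relative
                                                 % to the maximum-entropy (uniform) state

On this reading, a system's complexity would track H while its dual complexity tracks N; how the paper's composition laws combine these quantities is developed in the text itself.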
Subject
Computer Science (miscellaneous), Social Sciences (miscellaneous), Theoretical Computer Science, Control and Systems Engineering, Engineering (miscellaneous)