Uncertainty, measurement and thermodynamics of information

F. J. Evans, G. Langholz

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper we speculate on the existence of a notion of information and an information-theoretic structure that could be generally applicable in systems analysis. From basic principles and by physical analogies we hypothesize some qualitative and quantitative relationships between entropy and information. Information and entropy are here regarded as the basic, or primitive, system variables, conceptually interrelated in a manner which differs from the traditional ideas on information vis-à-vis entropy. In a quasi-physical manner it is proposed that information be considered as the one intensity and entropy as the one extensity that span the whole physical world. It is then possible to define a conserved quantity, termed informational power, relating information to entropy flow. The above ideas are then applied to systems in which the act of measurement appreciably affects the system itself. This can introduce an aspect of uncertainty to large-scale systems modelling similar to that occurring in particle physics. The central notions in this paper could provide a mechanism by means of which such aspects could be structurally accommodated in a model.
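As a minimal sketch of the intensity-extensity pairing described in the abstract (the notation below is illustrative and not drawn from the paper itself), the conserved "informational power" would plausibly take the familiar effort-flow product form of thermodynamic and electrical analogies, with information I playing the role of intensity and the rate of entropy flow dS/dt the flow of the extensity:

% Sketch only: P_info, I and S are assumed symbols, not the paper's notation.
% By analogy with thermal power P = T (dS/dt) and electrical power P = V (dQ/dt),
% an intensity times extensity-flow product:
\[
  P_{\text{info}} \;=\; I \,\frac{\mathrm{d}S}{\mathrm{d}t}
\]
% Conservation across an interconnection of subsystems would then read
\[
  \sum_{k} I_{k} \,\frac{\mathrm{d}S_{k}}{\mathrm{d}t} \;=\; 0
\]

Under this assumed form, the analogy mirrors how power is conserved at the junctions of a physical network, which is what would allow entropy and information to serve as primitive network variables in systems analysis.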

Original language: English
Pages (from-to): 281-288
Number of pages: 8
Journal: International Journal of Systems Science
Volume: 6
Issue number: 3
DOIs
State: Published - Mar 1975

