In this paper we speculate on the existence of a notion of information, and an information-theoretic structure, that could be generally applicable in systems analysis. From basic principles, and by physical analogy, we hypothesize some qualitative and quantitative relationships between entropy and information. Information and entropy are here regarded as the basic, or primitive, system variables, conceptually interrelated in a manner that differs from the traditional view of information vis-à-vis entropy. In a quasi-physical manner, it is proposed that information be considered the one intensity, and entropy the one extensity, that together span the whole physical world. It is then possible to define a conserved quantity, termed informational power, relating information to entropy flow. The above ideas are then applied to systems in which the act of measurement appreciably affects the system itself. This can introduce an aspect of uncertainty into large-scale systems modelling similar to that occurring in particle physics. The central notions of this paper could provide a mechanism by which such aspects could be structurally accommodated in a model.
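
The intensity–extensity analogy invoked above can be sketched in the manner of other physical power relations; the symbols below (I for information as intensity, S for entropy as extensity) are illustrative assumptions, not notation taken from the paper:

```latex
% By analogy with electrical power, P = V \, i, where voltage V is an
% intensity and current i is the flow of an extensity (charge), the
% proposed informational power would be the product of the intensity,
% information I, and the flow of the extensity, entropy S:
P_{\mathrm{info}} \;=\; I \, \frac{dS}{dt}
```

On this reading, conservation of informational power would constrain how information and entropy flow may trade off against one another across a system boundary, just as conservation of electrical power constrains voltage and current in a circuit.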