The General Theory of Information as a Unifying Factor for Information Studies
University of California, Los Angeles,
520 Portola Plaza, Los Angeles, CA 90095, USA
Abstract: We analyze the advantages and new opportunities that the general theory of information (GTI) provides for information studies.
The general theory of information (GTI) is a novel approach that offers powerful tools for all areas of information studies. The theory has three components:
- The axiomatic foundations
- The mathematical core
- The functional hull
In Section 2, we give a very brief exposition of the axiomatic foundations of the general theory of information. The mathematical core is presented in (Burgin, 1997; 2010; 2011; 2011b; 2011c; 2014) and some other publications. In Section 3, we demonstrate the advantages and new opportunities that the general theory of information (GTI) provides for science in general and for information studies in particular.
2. Axiomatic foundations of the general theory of information
The axiomatic foundation consists of principles, postulates and axioms of the general theory of information.
- Principles describe and explain the essence and main regularities of the information terrain.
- Postulates are formalized representations of principles.
- Axioms describe mathematical and operational structures used in the general theory of information.
There are two classes of principles:
- Ontological principles explain the essence of information as a natural and artificial phenomenon.
- Axiological principles explain how to evaluate information and what measures of information are necessary.
First, we consider the ontological principles.
There are three groups of ontological principles:
- Substantial ontological principles [O1, O2 and its modifications O2g, O2a, O2c] define information.
- Existential ontological principles [O3, O4, O7] describe how information exists in the physical world.
- Dynamical ontological principles [O5, O6] show how information functions.
Ontological Principle O1 (the Locality Principle). It is necessary to separate information in general from information (or a portion of information) for a system R.
In other words, empirically, it is possible to speak only about information (or a portion of information) for a system. This principle separates local and global approaches to information definition, i.e., in what context information is defined.
The Locality Principle explicates an important property of information but says nothing about what information is. The essence of information is described by the second ontological principle, which has several forms.
Ontological Principle O2 (the General Transformation Principle). In a broad sense, information for a system R is a capacity to cause changes in the system R.
Thus, we may understand information in a broad sense as a capacity (ability or potency) of things, both material and abstract, to change other things. Information exists in the form of portions of information.
The Ontological Principle O2 is fundamental as it intimately links information with time. Changes to R, when they occur by reception of information, are defined here to be the result of a causal process. Causality necessarily implies that the related effect happens after its cause. The Ontological Principle O2 leaves open the question whether the potential causal changes may or must be irreversible.
The Ontological Principle O2 unifies dynamic aspects of reality because information in a broad sense, projected onto the three primal components of reality (physical reality, mental reality and structural reality), amalgamates the conceptions of information, physical energy and mental energy, with its special form, psychic energy, in one comprehensive concept.
Being extremely wide-ranging, this definition supplies meaning and explanation to the conjecture of von Weizsäcker that energy might in the end turn out to be information, to Wheeler's aphorism "It from Bit", and to the statement of Smolin that the three-dimensional energetic world is the flow of information.
Mental energy is considered as a mood, ability or willingness to engage in some mental work and is often related to the activation level of the mind. The concept stems from an "energy of the soul" introduced by Henry More in his 1642 Psychodia platonica.
Psychic energy has become an essential component of several psychological theories. At first, the concept of psychic energy, also called psychological energy, was developed in the field of psychodynamics by the German scientist Ernst Wilhelm von Brücke (1819-1892). It was then further developed by his student Sigmund Freud (1856-1939) in psychoanalysis. The next step in its development was made by Freud's student Carl Gustav Jung (1875-1961).
Mental energy is innate for any mentality, while psychic energy is related only to the human psyche.
The next principle is
Ontological Principle O2g (the Relativized Transformation Principle). Information for a system R relative to the infological system IF(R) is a capacity to cause changes in the system IF(R).
The concept of infological system plays the role of a free parameter in the general theory of information, providing for the representation of different kinds and types of information in this theory. That is why the concept of infological system, in general, should not be limited by the boundaries of exact definitions. A free parameter must really be free. By identifying an infological system IF(R) of a system R, we can define different kinds and types of information.
Here are examples from popular information theories:
In Shannon’s information theory (or more exactly, a theory of communication), information is treated as elimination of uncertainty, i.e., as a definite change in the knowledge system of the receptor of information. In the semantic information theory of Bar‑Hillel and Carnap, information causes change in knowledge about the real state of a system under consideration. In algorithmic information theory, information about a constructive object, e.g., a string of symbols, is characterized by construction of this object, while information in one object about another one reflects changes in the systems of construction algorithms.
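To make the Shannon example concrete, here is a minimal Python sketch (an illustration of the idea, not a construction from the GTI itself): the receiver's uncertainty is measured by Shannon entropy, and the information received is the resulting reduction of that uncertainty.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a probability distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# The receiver is uncertain about which of 8 equally likely outcomes holds.
before = entropy([1/8] * 8)   # 3.0 bits

# A received message eliminates all but 2 equally likely outcomes.
after = entropy([1/2, 1/2])   # 1.0 bit

# Information, in Shannon's sense, is the change in the receiver's
# knowledge system: the amount of uncertainty eliminated.
info = before - after         # 2.0 bits
```

Here the infological system plays the role of the receiver's probability distribution over outcomes, which fits the GTI view that information is what changes an infological system.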
Taking a physical system D as the infological system and allowing only physical changes, we see that information with respect to D coincides with (physical) energy.
Taking a mental system B as the infological system and considering only mental changes, we see that information with respect to B coincides with mental energy.
Taking a cognitive system C as the infological system and considering only structural changes, we see that information with respect to C coincides with information per se.
As a model example of an infological system IF(R) of an intelligent system R, we take the system of knowledge of R. In cybernetics, it is called the thesaurus Th(R) of the system R. Another example of an infological system is the memory of a computer. Such a memory is a place in which data and programs are stored and is a complex system of diverse components and processes.
The concept of an infological system shows that not only living beings receive and process information. For instance, it is natural to treat the memory of a computer as an infological system. Then what changes this memory is information for the computer.
Ontological Principle O2a (the Special Transformation Principle). Information in the strict sense, or proper information, or, simply, information for a system R is a capacity to change structural infological elements from an infological system IF(R) of the system R.
There is no exact definition of infological elements, although there are various entities that are naturally considered as infological elements because they allow one to build theories of information that inherit conventional meanings of the word information. For instance, knowledge, data, images, algorithms, procedures, scenarios, ideas, values, goals, ideals, fantasies, abstractions, beliefs, and similar objects are standard examples of infological elements. Note that all these elements are structures and not physical things. That is why we use structural infological elements per se for identifying information in the strict sense.
This allows us to give an aesthetically eye-catching description of information:
Information is energy in the Platonic World of Ideas
Ontological Principle O2c (the Cognitive Transformation Principle). Cognitive information for a system R is a capacity to cause changes in the cognitive infological system CIF(R) of the system R.
An infological system IF(R) of the system R is called cognitive if IF(R) contains (stores) elements or constituents of cognition, such as knowledge, data, ideas, fantasies, abstractions, beliefs, etc. A cognitive infological system of a system R is denoted by CIF(R) and is related to cognitive information.
Having outlined (defined) the concept of information, let us consider how information exists in the physical world.
Ontological Principle O3 (the Embodiment Principle). For any portion of information I, there is always a carrier C of this portion of information for a system R.
The substance C that is a carrier of the portion of information I is called the physical, or material, carrier of I.
Ontological Principle O4 (the Representability Principle). For any portion of information I, there is always a representation C of this portion of information for a system R.
Ontological Principle O5 (the Interaction Principle). A transaction/transition/transmission of information occurs only in some interaction of the carrier C with the system R.
Ontological Principle O6 (the Actuality Principle). A system R accepts a portion of information I only if the transaction/transition/transmission causes corresponding transformations in R.
Ontological Principle O7 (the Multiplicity Principle). One and the same carrier C can contain different portions of information for one and the same system R.
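The existential and dynamical principles above can be summarized operationally in a toy Python model (a sketch under simplified, assumed semantics; the class names System and Carrier are mine, not the theory's): the infological system IF(R) is a set of knowledge items, every portion of information arrives on a carrier (O3), transmission requires an interaction (O5), and a portion is accepted only if it actually transforms IF(R) (O6).

```python
class Carrier:
    """Principle O3: every portion of information has a carrier;
    here the carried content is a set of knowledge items."""
    def __init__(self, content):
        self.content = set(content)

class System:
    """A system R whose infological system IF(R) is a set of knowledge items."""
    def __init__(self, knowledge):
        self.IF = set(knowledge)

    def receive(self, carrier):
        """Principles O5 and O6: transmission happens only through an
        interaction with a carrier, and the portion of information is
        accepted only if it causes a transformation in IF(R)."""
        before = set(self.IF)
        self.IF |= carrier.content      # the interaction with the carrier
        return self.IF != before        # accepted iff IF(R) changed

r = System({"a", "b"})
accepted = r.receive(Carrier({"b", "c"}))   # "c" is new, so IF(R) changes
repeated = r.receive(Carrier({"b", "c"}))   # nothing changes: no information
```

Note how the same carrier yields information on the first interaction but none on the second, because only the first causes a transformation in IF(R).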
Now we give a list of axiological principles.
Axiological Principle A1. A measure of information I for a system R is some measure of changes caused by I in R (for information in the strict sense, in IF(R)).
Note that it is possible to take the quantity of resources used for making the changes caused by information I in a system R as a measure of these changes and, consequently, as a measure of the information I.
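As one hypothetical instance of Principle A1 in Python: if IF(R) is modeled as a set of knowledge items, the changes caused by a portion of information can be measured by counting the elements it added or removed.

```python
def information_measure(if_before, if_after):
    """Principle A1, illustrative choice: measure a portion of information
    by the changes it caused in IF(R) -- here, the number of infological
    elements that were added or removed (the symmetric difference)."""
    return len(set(if_before) ^ set(if_after))

# Receiving a portion of information added "c" and "d" to IF(R):
m = information_measure({"a", "b"}, {"a", "b", "c", "d"})   # 2
```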
Axiological Principle A2. One carrier C can contain different portions of information for a given system R.
Axiological Principle A3. According to time orientation, there are three types of measures of information: 1) potential or perspective; 2) existential or synchronic; 3) actual or retrospective.
Axiological Principle A4. According to the scale of measurement, there are two groups, each of which contains three types of measures of information: (1) qualitative measures, which are divided into descriptive, operational and representational measures, and (2) quantitative measures, which are divided into numerical, comparative and splitting measures.
Axiological Principle A5. According to spatial orientation, there are three types of measures of information: external, intermediate, and internal.
Axiological Principle A6. Information I, which is transmitted from a carrier C to a system R, depends on interaction between C and R.
Axiological Principle A7. A measure of information transmission from a carrier C to a system R reflects a relation (such as a ratio or difference) between the measure of information that is admitted by the system R in the process of transmission and the measure of information that is presented by C in the same process.
3. The general theory of information as a unifying factor for information studies
First, the general theory of information gives a flexible, efficient and all-encompassing definition of information. In contrast to other definitions and descriptions used before, this definition is parametric, allowing the specification of information in general, as well as of information in any domain of nature, society and technology.
Moreover, the new definition, taken in a broad context, makes it possible to unite the conceptions of information, physical energy and psychic energy in one comprehensive concept. Being extremely wide-ranging, this definition supplies meaning and explanation to the conjecture of von Weizsäcker that energy might in the end turn out to be information, as well as to Wheeler's aphorism "It from Bit".
This shows that the general theory of information provides means for a synthesis of physics, psychology and information science, playing the role of a metatheory for these scientific areas.
At the same time, the new definition characterizes proper information when the general concept is specified by additional principles. The construction of an infological system allows researchers to exactly delineate information in the area of their studies.
Second, the general theory of information explains and makes available constructive tools for discerning information, measures of information, information representations and carriers of information. For instance, taking a letter written on a piece of paper, we see that the paper is the carrier of the information, the text is its representation, and it is possible to measure the quantity of this information using Shannon entropy or algorithmic complexity.
Third, the general theory of information provides efficient mathematical models. There are models of three types: information algebras, operator models based on functional analysis and operator models based on category theory. Functional representations of information dynamics preserve internal structures of information spaces associated with infological systems as their state or phase spaces. Categorical representations of information dynamics display external structures of information spaces associated with infological systems. Algebraic representations of information dynamics maintain intermediate structures of information spaces. These models allow researchers to discover intrinsic properties of information.
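The operator viewpoint can be suggested in miniature (my simplification; the GTI's functional-analytic and categorical constructions are far richer): portions of information act as operators on the states of an infological system, and sequential reception corresponds to operator composition, which is what gives rise to information algebras.

```python
def compose(f, g):
    """Sequential reception of two portions of information corresponds
    to composition of their operators on the state space of IF(R)."""
    return lambda state: g(f(state))

# Two hypothetical information operators on a state represented
# as a set of knowledge items:
def learn_c(state):
    return state | {"c"}   # this portion of information adds item "c"

def forget_a(state):
    return state - {"a"}   # this one removes item "a"

op = compose(learn_c, forget_a)
state = op({"a", "b"})     # yields {"b", "c"}
```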
Fourth, the general theory of information supplies methodological and theoretical tools for the development of measurement and evaluation technologies in information studies and information technology. Moreover, any science needs theoretical and practical means for making grounded observations and measurements. Different researchers in information theory have developed many methods and measures. The most popular of them are Shannon’s entropy and algorithmic complexity. The general theory of information unifies all these approaches opening new possibilities for building efficient methods and measures in areas where the currently used methods and measures are not applicable.
Fifth, the general theory of information offers an organization and structuring of the system of all existing information theories.
However, it is important to understand that this unifying feature and all the advantages of the general theory of information do not eliminate the need for special theories of information, which, being more specific, can go deeper in their investigation of the properties of information and information processes in various areas. For instance, syntactic information theories, such as Shannon's theory, are very useful in the area of communication. Algorithmic information theories, such as the theory of Kolmogorov complexity, are very useful in the area of automata, computation and algorithms. There are also semantic, pragmatic, economic, semiotic and other special information theories, each of which is directed at the investigation of specific properties of information, information processes and systems.
Sixth, the general theory of information explicates the relevant relations between information, knowledge and data, demonstrating that while knowledge and data are objects of the same type, with knowledge being more advanced than data, information is of a different type. These relations are expressed by the Knowledge-Information-Matter-Energy Square:
information is related to knowledge (data) as energy is related to matter
In particular, it is possible to transform knowledge or data into information, just as we can transform matter into energy.
Seventh, the general theory of information rigorously represents static, dynamic and functional aspects and features of information. These features are modeled and explored by algebraic, topological and analytical structures of operators in functional spaces and functors in the categorical setting forming information algebras, calculi and topological spaces.
Eighth, the general theory of information explicates and elucidates the role of information in nature, cognition, society and technology, clarifying important ontological, epistemological and sociological issues. For instance, this theory explains why popular, but inexact and sometimes incorrect, publications contain more information for people in general than advanced scientific works with outstanding results.
References
- Burgin, M. (1997) Information Algebras, Control Systems and Machines, No. 6, pp. 5-16 (in Russian)
- Burgin, M. (2003) Information Theory: A Multifaceted Model of Information, Entropy, v. 5, No. 2, pp. 146-160
- Burgin, M. (2004) Data, Information, and Knowledge, Information, v. 7, No. 1, pp. 47-57
- Burgin, M. (2010) Theory of Information: Fundamentality, Diversity and Unification, World Scientific, New York/London/Singapore
- Burgin, M. (2010) Information Operators in Categorical Information Spaces, Information, v. 1, No. 1, pp. 119-152
- Burgin, M. (2011) Information in the Structure of the World, Information: Theories & Applications, v. 18, No. 1, pp. 16-32
- Burgin, M. (2011a) Information: Concept Clarification and Theoretical Representation, TripleC, v. 9, No. 2, pp. 347-357 (http://triplec.uti.at)
- Burgin, M. (2011b) Epistemic Information in Stratified M-Spaces, Information, v. 2, No. 2, pp. 697-726
- Burgin, M. (2011c) Information Dynamics in a Categorical Setting, in Information and Computation, World Scientific, New York/London/Singapore, pp. 35-78
- Burgin, M. (2013) Evolutionary Information Theory, Information, v. 4, No. 2, pp. 224-268
- Burgin, M. (2014) Weighted E-Spaces and Epistemic Information Operators, Information, v. 5, No. 3, pp. 357-388
- Chaitin, G.J. (1977) Algorithmic Information Theory, IBM Journal of Research and Development, v. 21, No. 4, pp. 350-359
- Fisher, R.A. (1950) Contributions to Mathematical Statistics, Wiley, New York
- Frieden, B.R. (1998) Physics from Fisher Information, Cambridge University Press, Cambridge
- Shannon, C.E. (1948) A Mathematical Theory of Communication, Bell System Technical Journal, v. 27, No. 3, pp. 379-423; No. 4, pp. 623-656
- Smolin, L. (1999) The Life of the Cosmos, Oxford University Press, Oxford/New York
- von Baeyer, H.C. (2004) Information: The New Language of Science, Harvard University Press, Cambridge, MA
- von Weizsäcker, C.F. (1974) Die Einheit der Natur [The Unity of Nature], Deutscher Taschenbuch Verlag, Munich