Mutual Redundancies and Triple Contingencies among Perspectives on Horizons of Meaning
1  Amsterdam School of Communication Research (ASCoR), University of Amsterdam, PO Box 15793, 1001 NG Amsterdam, The Netherlands

Abstract:

Introduction

Unlike Shannon-type information—that is, the uncertainty in a probability distribution (Shannon, 1948, p. 10)—meaning can only be provided with reference to a system for which “the differences make a difference” at a place (MacKay, 1969; Bateson, 1972, p. 315). I argue that systems can be considered as densities in distributions of relations. However, the sets relate at the systems level not only in terms of relations, but also in terms of correlations. Because of potentially spurious correlations between two distributions of relations given a third one, uncertainty can also be reduced in the case of interactions among three (or more) sources of variation (Garner & McGill, 1956). In the case of a third agent, each position is correlationally defined in terms of the vector space that is spanned—as an architecture—by the set(s) of relations. This communication at the systems level can be expressed as mutual information in the overlap among the sets—or, with the opposite sign indicating a reduction of uncertainty, as mutual redundancy (Leydesdorff & Ivanova, 2014).
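
As an illustration of how such a three-way interaction can be computed, the following sketch evaluates the inclusion-exclusion term invoked above with reference to Garner and McGill (1956). The distribution is hypothetical and the sketch is not the specific operationalization of Leydesdorff and Ivanova (2014); the sign is read as in the text, with negative values indicating a reduction of uncertainty (mutual redundancy).

    import numpy as np

    def entropy_bits(p):
        """Shannon entropy in bits of a probability array (any shape)."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def three_way_interaction(p_xyz):
        """Inclusion-exclusion term among three variables:
        T = H(x) + H(y) + H(z) - H(xy) - H(xz) - H(yz) + H(xyz).
        Negative values are read here as mutual redundancy, that is,
        a reduction of uncertainty at the level of the configuration."""
        h_x = entropy_bits(p_xyz.sum(axis=(1, 2)))
        h_y = entropy_bits(p_xyz.sum(axis=(0, 2)))
        h_z = entropy_bits(p_xyz.sum(axis=(0, 1)))
        h_xy = entropy_bits(p_xyz.sum(axis=2))
        h_xz = entropy_bits(p_xyz.sum(axis=1))
        h_yz = entropy_bits(p_xyz.sum(axis=0))
        return h_x + h_y + h_z - h_xy - h_xz - h_yz + entropy_bits(p_xyz)

    # Hypothetical three-way distribution in which each variable is fully
    # determined by the other two (the four listed cells have probability 1/4).
    p = np.zeros((2, 2, 2))
    p[0, 0, 0] = p[0, 1, 1] = p[1, 0, 1] = p[1, 1, 0] = 0.25
    print(three_way_interaction(p))  # -1.0: this triadic configuration reduces uncertainty by one bit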

On top of the information exchanges and the correlations among the meanings, discursive knowledge develops by relating meanings reflexively on the basis of cognitive codes that remain mentally and socially constructed. Positions first make it possible to develop perspectives; translations among perspectives provide a third layer of the exchange on top of information processing in relations and the redundancy potentially generated when meanings are shared. The third layer develops as horizons of meaning that can be entertained reflexively, and that enable us to translate among meanings.

For example, a perspective can be used to develop discursively a rationalized system of expectations, and thus to generate knowledge at each individual level by codifying specific meanings. The codification, however, provides an additional selection mechanism: the translation among perspectives thus adds a third layer by potentially codifying communication at the supra-individual level on top of the processing of information and meaning. In this context, the notion of “double contingency” (Parsons, 1968, p. 436; Parsons & Shils, 1951, p. 16) can be extended to a “triple contingency” (Strydom, 1999, p. 12). Meaningful information can first be selected from the Shannon-type information fluxes on the basis of codes that are further developed in the reflexive communications among us about expectations. This third layer enables us to develop models of possible future states.

The three layers operate in parallel. The construction of this triple-layered system is bottom-up, but—following a cybernetic principle—control can increasingly be exercised top-down as the feedback layers are further developed (Ashby, 1958). Whereas the three contingencies can be expected to develop in parallel, this assumed inversion of control enables us to hypothesize, for analytical reasons, a hierarchy among the layers. Let me extend the single-layered and linear Shannon model (Figure 1 below) stepwise into such a triple-layered model (Figure 2).

Extensions of the Shannon-Weaver Model

As is well known, Shannon (1948, p. 3) first focused on information that was not (yet) meaningful: “Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities.” According to Shannon (1948, p. 3), however, “(t)hese semantic aspects of communication are irrelevant to the engineering problem.” It is less well known that Shannon’s co-author Warren Weaver argued that Shannon’s distinction between information and meaning “has so penetratingly cleared the air that one is now, for the first time, ready for a real theory of meaning” (Shannon & Weaver, 1949, p. 27). To this end, Weaver (1949, p. 26) proposed to insert another box with the label “semantic noise” into the Shannon model between the information source and the transmitter, as follows (Figure 1):

Figure 1. Schematic diagram of a general communication system. Source: Shannon (1948, p. 380); with Weaver’s box of “semantic noise” first added (to the left) and then further extended with a second source of “semantic noise” between the receiver and the destination (to the right).

(see PDF version for the Figure).

What if one adds a similar box to the right side of this figure, between the receiver and the destination of the message (added in grey to Figure 1)? The two sources of semantic noise may be correlated; for example, when the sender and the receiver of the message share a language or, more generally, a code of communication. I propose to distinguish between “language” as the natural—that is, undifferentiated—code of communication and codes of communication that can be symbolically generalized and then no longer require the use of language (Luhmann, 2002 and 2012, pp. 120 ff.; Parsons, 1968). For example, instead of negotiating about the price of a commodity, one can simply pay the market price using money as a symbolically generalized medium of communication. One is able to translate reflexively among codes of communication by elaborating upon the different meanings of the information in language (Bernstein, 1971).[1]

Thus, one arrives at the following model (Figure 2):

Figure 2. Three mutual contingencies in the dynamics of codified knowledge.

(see PDF version for the Figure).

In other words, one can distinguish between “meaningful information”—potentially reducing uncertainty—and Shannon-type information that is by definition equal to uncertainty (Hayles, 1990, p. 59). Shannon (1948) chose his formulas so that uncertainty could be measured as probabilistic entropy in bits of information. The mathematical theory of communication thus provides us with entropy statistics that can be used in different domains (Bar-Hillel, 1955; Krippendorff, 1986; Theil, 1972). Meaning is provided to the information with hindsight, that is, from the perspective of the “later event” as a system of reference. However, the measurement of “meaningful information,” in bits or otherwise, has hitherto remained without an operationalization (cf. Dretske, 1981).
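
For reference, Shannon’s measure of this uncertainty is H = -Σ_i p_i log2(p_i), expressed in bits because the logarithm has base two; a uniform distribution over four equally probable messages, for example, yields H = 4 × (1/4 × 2) = 2 bits.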

Perspectives

In my presentation, I explore two avenues for the generation and measurement of negative entropy:

(1) Dubois’ (1998) proposal to distinguish between recursive routines with the arrow of time—necessarily generating entropy—and incursive ones against the arrow of time, thus reducing uncertainty: (i) in the case of recursion, x(t) = f(x(t-1)); (ii) in the case of incursion, x(t) = f(x(t)); and (iii) in the case of hyper-incursion, x(t) = f(x(t+1)). The codes as mental constructs operate in terms of structures of expectations and thus hyper-incursively on the ongoing trajectories of instantiations. The instantiations operate in the present (that is, incursively), whereas the trajectories develop historically along the arrow of time (that is, recursively). A minimal numerical sketch of these routines follows after this list.

(2) Mutual redundancy in three or more dimensions provides us with a measure of the resulting potential for options in a configuration of expectations other than the ones historically realized: Kauffman’s (2000) “adjacent possible.” Mutual redundancies can be generated when the uncertainty is appreciated from three or more different perspectives in a static design, or dynamically among the three layers of communication distinguished in Figure 2, that is, in terms of recursion, incursion, and hyper-incursion. The latter dynamic requires human intelligence, since one has to be able to entertain expectations with respect to the expectations of the other in a “double contingency” (Parsons, 1968, p. 436; Parsons & Shils, 1951, p. 16). The communication among perspectives (at the supra-individual level) can then be expected to provide a third contingency (Strydom, 1999, p. 12).
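
As announced under (1), the three routines can be written out numerically. The sketch below uses the logistic function f(x) = a * x * (1 - x) only as an assumed illustration; the parameter value is arbitrary, the incursive case is written in a mixed past-and-present form, and the closed-form solutions follow from these assumptions rather than from Dubois’ (1998) specific models.

    import math

    a = 3.8  # hypothetical parameter of the logistic function f(x) = a * x * (1 - x)

    def recursive_step(x_prev):
        """Recursion with the arrow of time: x(t) = f(x(t-1))."""
        return a * x_prev * (1.0 - x_prev)

    def incursive_step(x_prev):
        """Incursion: the current state enters its own computation. Writing the
        mixed form x(t) = a * x(t-1) * (1 - x(t)) and solving for x(t) yields a
        state evaluated in the present."""
        return a * x_prev / (1.0 + a * x_prev)

    def hyperincursive_step(x_prev):
        """Hyper-incursion: the present state is defined from the future one,
        x(t) = a * x(t+1) * (1 - x(t+1)); the quadratic has two roots, so a
        decision among possible future states is required."""
        discriminant = 1.0 - 4.0 * x_prev / a
        if discriminant < 0:
            raise ValueError("no real future state for these values of x and a")
        root = math.sqrt(discriminant)
        return 0.5 * (1.0 + root), 0.5 * (1.0 - root)  # two admissible futures

    x = 0.3
    print(recursive_step(x), incursive_step(x), hyperincursive_step(x))

In this reading, the recursive routine generates a historical trajectory, the incursive one instantiates a state in the present, and the hyper-incursive one confronts the system with more than one admissible next state, so that expectations add options to the ones historically realized.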

References and Notes

  1. Ashby, W. R. (1958). Requisite variety and its implications for the control of complex systems. Cybernetica, 1(2), 1-17.
  2. Bar-Hillel, Y. (1955). An Examination of Information Theory. Philosophy of Science, 22, 86-105.
  3. Bateson, G. (1972). Steps to an Ecology of Mind. New York: Ballantine.
  4. Bernstein, B. (1971). Class, Codes and Control, Vol. 1: Theoretical studies in the sociology of language. London: Routledge & Kegan Paul.
  5. Dretske, F. I. (1981). Knowledge and the Flow of Information. Cambridge, MA: MIT Press.
  6. Dubois, D. M. (1998). Computing Anticipatory Systems with Incursion and Hyperincursion. In D. M. Dubois (Ed.), Computing Anticipatory Systems, CASYS-First International Conference (Vol. 437, pp. 3-29). Woodbury, NY: American Institute of Physics.
  7. Garner, W. R., & McGill, W. J. (1956). The relation between information and variance analyses. Psychometrika, 21(3), 219-228.
  8. Krippendorff, K. (1986). Information Theory: Structural Models for Qualitative Data. Beverly Hills, CA: Sage.
  9. Leydesdorff, L., & Ivanova, I. A. (2014). Mutual Redundancies in Inter-human Communication Systems: Steps Towards a Calculus of Processing Meaning. Journal of the Association for Information Science and Technology, 65(2), 386-399.
  10. Luhmann, N. (2002). How Can the Mind Participate in Communication? In W. Rasch (Ed.), Theories of Distinction: Redescribing the Descriptions of Modernity (pp. 169–184). Stanford, CA: Stanford University Press.
  11. Luhmann, N. (2012). Theory of Society, Vol. 1. Stanford, CA: Stanford University Press.
  12. MacKay, D. M. (1969). Information, Mechanism and Meaning. Cambridge and London: MIT Press.
  13. Parsons, T. (1968). Interaction: I. Social Interaction. In D. L. Sills (Ed.), The International Encyclopedia of the Social Sciences (Vol. 7, pp. 429-441). New York: McGraw-Hill.
  14. Parsons, T., & Shils, E. A. (1951). Toward a General Theory of Action. New York: Harper and Row.
  15. Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27, 379-423 and 623-656.
  16. Strydom, P. (1999). Triple Contingency: The theoretical problem of the public in communication societies. Philosophy & Social Criticism, 25(2), 1-25.
  17. Theil, H. (1972). Statistical Decomposition Analysis. Amsterdam/London: North-Holland.
  18. Weaver, W. (1949). Some Recent Contributions to the Mathematical Theory of Communication. In C. E. Shannon & W. Weaver (Eds.), The Mathematical Theory of Communication (pp. 93-117). Urbana: University of Illinois Press.

[1] I deviate here from Luhmann’s theory, in which the sub-systems of communication are operationally closed so that communications cannot be transmitted reflexively from one system into another (cf. Callon, 1998; Leydesdorff, 2006 and 2010a).

Keywords: codification, anticipation, incursion, meaning, modeling