What Must the World Be Like to Have Information About It?
John Collier
University of KwaZulu-Natal

Abstract:

In everyday usage, information is knowledge or facts acquired or derived from study, instruction or observation. Information is presumed to be both meaningful and veridical, and to have some appropriate connection to its object. Information might be misleading, but it can never be false. Standard information theory, on the other hand, as developed for communication [1], measurement [2], induction [3; 4] and computation [5; 6], entirely ignores the semantic aspects of information. Thus it might seem to have little relevance to our common notion of information. This impression is reinforced by the sheer range of applications of information theory found in the literature of a variety of fields. Assuming, however, that the mind works computationally and can get information about things via physical channels, technical accounts of information strongly restrict any plausible account of the vulgar notion. Some more recent information-oriented approaches to epistemology [7] and semantics [8] go further, though my introduction to the ideas was through Michael Arbib, Michael Scriven and Kenneth Sayre in the profoundly inventive late 60s and early 70s.
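To see concretely what "ignoring semantics" amounts to, note that Shannon entropy depends only on the statistics of the symbols, not on what, if anything, they mean: a meaningful sentence and a meaningless rearrangement of the same letters carry exactly the same number of bits per symbol. A minimal Python sketch (my illustration, not drawn from the cited works):

    from collections import Counter
    from math import log2

    def shannon_entropy(msg: str) -> float:
        """Average information per symbol, in bits."""
        n = len(msg)
        return -sum(c / n * log2(c / n) for c in Counter(msg).values())

    # Same letters, same frequencies -- one meaningful, one not.
    meaningful = "the cat sat"
    scrambled = "eht tac tas"
    assert abs(shannon_entropy(meaningful) - shannon_entropy(scrambled)) < 1e-12

The measure cannot distinguish the two strings, which is exactly the gap between the technical and the everyday notion.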

In this talk I will look at how the world must be in order for us to have information about it. The talk has three major sections: 1) intrinsic information -- there is a unique information content in any structure that can be determined using group theory; 2) the physical world (including our minds) must have specific properties in order for us to have information about it; and 3) the nature of the information channels that can convey information to us for evaluation and testing. In the process I will outline theories of physical information and semantic information. Much of the talk will be a (I hope) simplified version of [9] and [10], other sources on my web page, and the book Every Thing Must Go [11].
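As a toy version of the group-theoretic idea in (1) -- my reconstruction for illustration, not the formalism of [9] -- one can measure the information a labelling adds to a structure once its symmetries are factored out. By the orbit-stabilizer theorem, a structure on n elements with automorphism group Aut has n!/|Aut| distinguishable labellings, so the more symmetric the structure, the less intrinsic information a labelling carries:

    from itertools import permutations
    from math import factorial, log2

    def aut_size(n, edges):
        # Brute-force count of vertex permutations preserving adjacency.
        return sum(
            {frozenset((p[a], p[b])) for a, b in edges} == edges
            for p in permutations(range(n))
        )

    def labelling_bits(n, edges):
        # Bits needed to single out one of the n!/|Aut| distinct labellings.
        return log2(factorial(n) / aut_size(n, edges))

    path = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3)]}           # |Aut| = 2
    cycle = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}  # |Aut| = 8
    print(labelling_bits(4, path))   # log2(12), about 3.58 bits
    print(labelling_bits(4, cycle))  # log2(3),  about 1.58 bits

The more symmetric 4-cycle admits fewer distinguishable labellings than the path, so a labelling of it carries fewer bits; in the limit of a fully symmetric structure a labelling adds no information at all.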

Acknowledgments

I acknowledge the support of the National Research Foundation of South Africa.

References and Notes

  1. Shannon, C.E. and Weaver, W. 1949. The Mathematical Theory of Communication. Urbana, University of Illinois Press.
  2. Brillouin, L. 1962. Science and Information Theory, 2nd edition. New York, Academic Press.
  3. Solomonoff, R. 1964. A formal theory of inductive inference, Part I. Information and Control 7(1): 1-22.
  4. Solomonoff, R. 1964. A formal theory of inductive inference, Part II. Information and Control 7(2): 224-254.
  5. Kolmogorov, A.N. 1965. Three approaches to the quantitative definition of information. Problems of Inform. Transmission 1: 1-7.
  6. Chaitin, G.J. 1975. A theory of program size formally identical to information theory. J. ACM 22: 329-340.
  7. Dretske, F. 1981. Knowledge and the Flow of Information. Cambridge, MA, MIT Press.
  8. Barwise, J. and Perry, J. 1983. Situations and Attitudes. Cambridge, MA, MIT Press.
  9. Collier, J. 1990. Intrinsic information. In Philip Hanson (ed.), Information, Language and Cognition: Vancouver Studies in Cognitive Science, Vol. 1. University of British Columbia Press (now Oxford University Press): 390-409.
  10. Collier, J. 2012. Information, causation and computation. In Gordana Dodig Crnkovic and Mark Burgin (eds), Information and Computation: Essays on Scientific and Philosophical Understanding of Foundations of Information and Computation. Singapore, World Scientific: 89-106.
  11. Ladyman, J. and Ross, D., with Spurrett, D. and Collier, J. 2007. Every Thing Must Go: Metaphysics Naturalized. Oxford, Oxford University Press.
Keywords: information, realism, epistemology, channels