How Information Lost Its Meaning (and How to Recover It)
Published: 30 June 2015 by MDPI in ISIS Summit Vienna 2015—The Information Society at the Crossroads, session Invited Speech
Abstract:
The technical concept of information developed by Shannon (1948) and those who followed has fueled advances in many fields, from fundamental physics to bioinformatics, but its quantitative precision and breadth of application have come at a cost: they have impeded its usefulness in fields distinguished by the need to explain reference and functional significance, such as evolutionary biology, cognitive neuroscience, the social sciences, and even the humanities. To provide the foundation for a theory of information precise and formal enough to serve these needs, it is necessary to expand and slightly reformulate the technical concept of information so that it accounts for these properties, which are not intrinsic statistical properties of the conveying medium. Ironically, I will argue that this nevertheless requires attending to the physical properties of the information medium with respect to those of its context, and specifically to the relationship between thermodynamic and information entropies. I will argue that referential information is a function of the thermodynamic openness of the medium to physical work, and thus of the modifiability of the constraints that it embodies. Susceptibility to physical work is also the relevant measure for assessing the significance or usefulness of information, since this is reflected in the amount of work “saved” as a result of access to information about certain contextual factors. This suggests a way to re-integrate these previously set-aside properties of reference and significance into a rigorous analysis of information that can be made compatible with the full range of semiotic issues.
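The abstract invokes, without stating it, the standard formal bridge between information entropy and thermodynamic entropy. As background only (these are textbook relations, not the author's own formulation), the correspondence can be sketched as:

\[
H(X) = -\sum_i p_i \log_2 p_i, \qquad S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H(X),
\]

where $H(X)$ is the Shannon entropy of a distribution $p_i$ and $S$ the Gibbs entropy of a physical system over the same microstate probabilities. By Landauer's bound, erasing one bit at temperature $T$ dissipates at least $W_{\min} = k_B T \ln 2$ of work, which suggests one concrete sense in which the work “saved” by access to information could be given a definite thermodynamic measure.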
Keywords: information; reference; significance; entropy; work