The limitations of human information processing and their implications for parsimonious computational modelling and reliable Artificial Intelligence

Published: 30 November 2021 by MDPI in The 1st International Electronic Conference on Information, session Information Processes and Artificial Intelligence
https://doi.org/10.3390/IECI2021-11949

Abstract: Information theory is concerned with the study of the transmission, processing, extraction, and utilization of information. In its most abstract form, information is conceived as a means of resolving uncertainty. Shannon and Weaver (1949) were among the first to develop a conceptual framework for information theory. One of the key assumptions of their model is that uncertainty increases linearly with the amount of complexity (in bits) of the information transmitted or generated (C.E. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, IL, 1949). A large body of evidence from the cognitive neurosciences has since shown that human response or action times increase in a similar fashion as a function of information complexity across a wide variety of situations and contexts. In this paper, I discuss what is currently known about the limitations of human information processing. The implications for the development of parsimonious computational models in science, and for the idea of reliable Artificial Intelligence for science and society, are made clear in the light of arguments from the cognitive neurosciences and computational philosophy. The goal of the presentation is to carve out a conceptual framework intended to inspire future studies on the problems identified.

Keywords: Information Theory; Uncertainty; Complexity; Human Response Time; Processing Limitations; Parsimonious Computing; Artificial Intelligence
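The relationship the abstract alludes to, response time growing with the information content (in bits) of the stimulus set, is commonly formalized as the Hick-Hyman law, RT = a + b·H, where H is the Shannon entropy of the choice set. A minimal sketch of that formalization follows; the coefficients a and b are illustrative assumptions, not values from the paper:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def hick_hyman_rt(probs, a=0.2, b=0.15):
    """Predicted response time (seconds) as a linear function of stimulus
    entropy; a (base time) and b (seconds per bit) are made-up coefficients
    used purely for illustration."""
    return a + b * shannon_entropy(probs)

# Four equally likely alternatives carry log2(4) = 2 bits of uncertainty,
# so predicted response time rises by two "entropy steps" over the base time.
uniform4 = [0.25] * 4
print(shannon_entropy(uniform4))  # 2.0
print(hick_hyman_rt(uniform4))
```

Note how a skewed distribution (e.g. one highly probable alternative) carries less entropy than a uniform one over the same number of alternatives, so the law predicts faster responses, which matches the intuition that more predictable stimuli are processed more quickly.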