Discussions about information centered on Shannon's information theory, whether for or against it, are ill-suited and cannot advance the debate unless that framework is left behind, just as the classical theory of randomness was superseded in the early 1960s by the theory of algorithmic information. While it is true that physics, and many other areas, have been painfully slow to move away from Shannon entropy and still find a wide range of applications for it, we will show how more powerful generative and predictive data-driven models will, and should, eventually replace it. I will argue that discussions in the Philosophy of Information should thus be one step ahead rather than several behind, not only guiding the philosophical discussion but also leading and steering scientific attention. I will explain the relevance of algorithmic complexity as a salient property at the core of the scientific method, especially in the challenge of causal discovery, and why fears of moving away from entropy based on arguments of uncomputability are unfounded: such fears preclude progress and embrace defeat.
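To make the abstract's central contrast concrete, the sketch below compares Shannon entropy, which sees only symbol frequencies, with a compression-based proxy for algorithmic (Kolmogorov-Chaitin) complexity. Kolmogorov complexity itself is uncomputable, so this is only a crude upper bound via `zlib`; the particular strings and the choice of compressor are illustrative assumptions, not part of the original abstract. A perfectly regular alternating string and a pseudo-random string have the same character-level Shannon entropy, yet very different compressed lengths:

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy(s):
    """Shannon entropy in bits per symbol, from empirical symbol frequencies."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def compressed_size(s):
    """Length in bytes of the zlib-compressed string: a crude, computable
    upper-bound stand-in for the (uncomputable) algorithmic complexity."""
    return len(zlib.compress(s.encode(), 9))

regular = "01" * 500  # highly structured: generated by a very short program
random.seed(0)
rnd = "".join(random.choice("01") for _ in range(1000))  # pseudo-random

# Both strings have (near-)maximal Shannon entropy of ~1 bit per symbol...
print(shannon_entropy(regular), shannon_entropy(rnd))
# ...but compression exposes the regular string's algorithmic structure:
# it compresses to far fewer bytes than the pseudo-random one.
print(compressed_size(regular), compressed_size(rnd))
```

Entropy, blind to order, assigns both strings the same information content; the compressed lengths, however imperfect as complexity estimates, separate structure from randomness, which is the kind of distinction the abstract argues data-driven, algorithmic models should exploit.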
Philosophy of Information: the urgent need to move away from entropy towards algorithmic information
Published: 9 June 2017 by MDPI in the Digitalisation for a Sustainable Society session, Third International Conference on Philosophy of Information
Keywords: algorithmic randomness, Kolmogorov-Chaitin complexity, algorithmic probability, Shannon entropy, philosophy of information