Some New Insights Into Information Decomposition in Complex Systems Based on Common Information
Published: 03 November 2014 by MDPI in the 1st International Electronic Conference on Entropy and Its Applications, session Complex Systems
Abstract: We take a closer look at the structure of bivariate dependency induced by a pair of predictor random variables (X, Y) trying to encode a target random variable, Z. The information that the pair (X, Y) contains about the target Z can have aspects of redundant information (contained identically in both X and Y), of unique information (contained exclusively in either X or Y), and of synergistic information (available only when X and Y are taken together). Williams and Beer proposed such a decomposition for the general case of K predictors to specify how the total information about the target is shared across the singleton predictors and their overlapping or disjoint coalitions. However, effecting a non-negative decomposition is known to be a surprisingly difficult problem, even for the case of K = 3. In particular, it is not always possible to attribute operational significance to all the atoms induced by the decomposition. What operational questions should an ideal measure of redundant or unique information answer? In this paper, we seek to demonstrate the richness of this question through the lens of network information theory. We show the following. First, we evaluate a recently proposed measure of redundancy based on Gács and Körner's common information (Griffith et al., Entropy, vol. 16, no. 4, pp. 1985–2000, 2014) and show that the measure, in spite of its elegance, is degenerate for most non-trivial distributions. Second, we show that Wyner's common information also fails to capture the notion of redundancy, as it violates an intuitive monotonically non-increasing property. Third, we show why a combinatorial dual of Gács and Körner's common information is unremarkable as a measure of unique information. Fourth, we identify a set of conditions under which a conditional version of Gács and Körner's common information is an ideal measure of unique information.
Finally, we show how the notions of approximately sufficient statistics and conditional information bottleneck can be used to quantify unique information.
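The distinction between singleton and synergistic information described in the abstract can be made concrete with the canonical XOR example, which is standard in this literature though not part of this abstract: if X and Y are independent uniform bits and Z = X XOR Y, then neither predictor alone tells us anything about Z, yet together they determine it completely. The sketch below (an illustration, not the authors' method) computes the relevant mutual information terms directly from the joint distribution:

```python
import math

# Joint distribution p(x, y, z) for Z = X XOR Y, with X, Y independent uniform bits.
p = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

def marginal(dist, idx):
    """Marginalize the joint distribution onto the given coordinate indices."""
    out = {}
    for outcome, prob in dist.items():
        key = tuple(outcome[i] for i in idx)
        out[key] = out.get(key, 0.0) + prob
    return out

def mutual_information(dist, a_idx, b_idx):
    """I(A; B) in bits, where A and B are tuples of coordinate indices."""
    pa, pb = marginal(dist, a_idx), marginal(dist, b_idx)
    pab = marginal(dist, a_idx + b_idx)
    return sum(prob * math.log2(prob / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, prob in pab.items())

# Each singleton predictor carries zero information about Z; the pair carries
# one full bit -- all of I(X,Y; Z) is synergistic in this example.
print(mutual_information(p, (0,), (2,)))    # I(X; Z)   -> 0.0
print(mutual_information(p, (1,), (2,)))    # I(Y; Z)   -> 0.0
print(mutual_information(p, (0, 1), (2,)))  # I(X,Y; Z) -> 1.0
```

This is the extreme case where the Williams–Beer decomposition assigns everything to the synergy atom; distributions mixing copy and XOR mechanisms populate the redundant and unique atoms as well, and it is for those that the measures discussed above diverge.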
Keywords: synergy; common information; information lattice; information bottleneck