Only One Nonlinear Non-Shannon Inequality is Necessary for Four Variables
Drexel University

Abstract:

The region of entropic vectors $\overline{\Gamma}^{*}_N$ has been shown to be at the core of determining fundamental limits for network coding, distributed storage, conditional independence relations, and information theory. Characterizing this region is a problem that lies at the intersection of probability theory, group theory, and convex optimization. A $(2^{N}-1)$-dimensional vector is said to be entropic if there exist $N$ discrete random variables such that each of its entries is the joint entropy of the corresponding nonempty subset of those variables. While the explicit characterization of the region of entropic vectors $\overline{\Gamma}^{*}_N$ is unknown for $N \geq 4$, here we prove that only one form of nonlinear non-Shannon inequality is necessary to fully characterize $\overline{\Gamma}^{*}_4$. We identify this inequality in terms of a function that is the solution to an optimization problem. We also give some symmetry and convexity properties of this function, which rely on the structure of the region of entropic vectors and the Ingleton inequalities. This result shows that inner and outer bounds on the region of entropic vectors can be created by upper and lower bounding the function that solves this optimization problem.
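To make the definition concrete, the following is a minimal illustrative sketch (ours, not taken from the paper): given a joint pmf over $N = 4$ discrete random variables, it computes the $(2^N - 1)$-dimensional entropy vector and evaluates the Ingleton expression $I(X_1;X_2) - I(X_1;X_2|X_3) - I(X_1;X_2|X_4) - I(X_3;X_4)$, which is nonpositive on the Ingleton inner bound but can be positive for general entropic vectors. The function names (`entropy_vector`, `ingleton`) are hypothetical.

    import itertools
    import math

    def entropy_vector(pmf, N):
        """Map a joint pmf {outcome tuple: prob} to its entropy vector.

        The entry for each nonempty subset A of {0, ..., N-1} is the
        joint entropy H(X_A) in bits, giving 2^N - 1 entries in total.
        """
        h = {}
        for r in range(1, N + 1):
            for A in itertools.combinations(range(N), r):
                # Marginalize the joint pmf onto the coordinates in A.
                marg = {}
                for x, p in pmf.items():
                    key = tuple(x[i] for i in A)
                    marg[key] = marg.get(key, 0.0) + p
                h[A] = -sum(p * math.log2(p) for p in marg.values() if p > 0)
        return h

    def ingleton(h):
        """Ingleton expression for (X1, X2; X3, X4), indices 0..3.

        Uses I(X;Y) = H(X) + H(Y) - H(XY) and
        I(X;Y|Z) = H(XZ) + H(YZ) - H(XYZ) - H(Z).
        """
        H = lambda *A: h[tuple(sorted(A))]
        i12   = H(0) + H(1) - H(0, 1)
        i12_3 = H(0, 2) + H(1, 2) - H(0, 1, 2) - H(2)
        i12_4 = H(0, 3) + H(1, 3) - H(0, 1, 3) - H(3)
        i34   = H(2) + H(3) - H(2, 3)
        return i12 - i12_3 - i12_4 - i34

    # Example: four i.i.d. fair bits; the Ingleton expression is <= 0.
    pmf = {x: 1.0 / 16 for x in itertools.product((0, 1), repeat=4)}
    h = entropy_vector(pmf, 4)
    print(ingleton(h))  # 0.0 for independent bits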

Keywords: Entropy Vector, Non-Shannon Inequality