On a general definition of conditional Rényi entropies
Published: 21 November 2017 by MDPI
in 4th International Electronic Conference on Entropy and Its Applications, session Information and Complexity
Abstract:
In recent decades, several definitions of the conditional Rényi entropy (CRE) have been introduced. Arimoto proposed a definition that has found application in information theory, Jizba and Arimitsu proposed one that has found application in time series analysis, and Renner-Wolf, Hayashi, and Cachin proposed definitions suited to cryptographic applications. However, there is still no commonly accepted definition, nor a general treatment, of the CREs, which can intuitively be understood as the average uncertainty about a random variable X when a random variable Y is given. In this paper we fill this gap and propose a three-parameter CRE that contains all of the previous definitions as special cases, each recovered by a suitable choice of the parameters. Moreover, it satisfies all of the properties that are simultaneously satisfied by the previous definitions, so it can be used successfully in the aforementioned applications. In particular, we show that the proposed CRE is positive, continuous, symmetric, permutation invariant, equal to the Rényi entropy for independent X and Y, equal to zero for X = Y, and monotonic. In addition, as an example of further use, we discuss the properties of a generalized mutual information defined by means of the proposed CRE.
Keywords: Rényi entropy; conditional entropy; mutual information
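For orientation only, and not as a statement of the authors' three-parameter family (which is given in the full paper and not reproduced in this abstract), the standard quantities referred to above take the following well-known forms: the Rényi entropy of order α (α > 0, α ≠ 1) of a discrete random variable X, Arimoto's conditional Rényi entropy, and the chain-rule (Jizba-Arimitsu) conditional Rényi entropy are

H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_{x} P_X(x)^{\alpha},

H_\alpha^{A}(X \mid Y) = \frac{\alpha}{1-\alpha} \log \sum_{y} \Bigl( \sum_{x} P_{XY}(x,y)^{\alpha} \Bigr)^{1/\alpha},

H_\alpha^{JA}(X \mid Y) = H_\alpha(X,Y) - H_\alpha(Y),

where H_\alpha(X,Y) denotes the joint Rényi entropy. A generalized mutual information of the kind mentioned in the abstract is commonly built from a chosen CRE as I_\alpha(X;Y) = H_\alpha(X) - H_\alpha(X \mid Y); in the limit α → 1 all of these expressions reduce to their Shannon counterparts.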