Estimation of Relative Entropy Measures Based on Quantile Regression
Published: 05 May 2021 by MDPI in Entropy 2021: The Scientific Tool of the 21st Century, session Entropy in Multidisciplinary Applications
Abstract:
The estimation of relative entropy measures such as mutual information, conditional and joint entropy, or transfer entropy requires the estimation of conditional and joint densities. When the data are continuous, multivariate kernel density estimation or a discretization scheme is usually applied. A problem with the discretization approach is that, for mutual information as well as transfer entropy, the resulting estimate does not converge monotonically to the true value as the number of discrete bins increases. In the absence of a distribution theory, hypothesis testing is then only possible by means of bootstrapping. We propose instead to estimate the necessary joint and conditional densities by means of quantile regression. This allows us to avoid arbitrary binning and all of its associated problems. Moreover, due to the semi-parametric nature of this approach, the computational burden is substantially reduced compared to multivariate kernel density estimation. We show that quantile regressions can be used flexibly to estimate the densities required to compute joint, conditional, and transfer entropy as well as mutual information. Furthermore, by casting our estimator into a Generalized Method of Moments framework, we develop the asymptotic theory needed to conduct inference on relative entropy measures for multiple variables.
Keywords: conditional entropy estimation; joint entropy estimation; mutual information estimation; quantile regression
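To make the density estimation step concrete, the sketch below illustrates one way the idea can work: fit quantile regressions on a grid of quantile levels, convert the spacing between adjacent predicted conditional quantiles into a conditional density estimate, and plug that density into a conditional entropy estimate. This is a minimal illustration under an assumed linear conditional quantile model, with simulated data and hypothetical variable names, using statsmodels' QuantReg; it is not the authors' estimator, and the GMM-based asymptotic inference described in the abstract is not shown.

```python
"""Sketch: conditional entropy H(Y|X) via quantile-regression densities.

Illustrative only; assumes linear conditional quantiles and drops tail
observations outside the fitted quantile grid.
"""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated example: Y = 1 + 2*X + N(0,1) noise, so the true conditional
# density of Y given X is N(1 + 2x, 1) and H(Y|X) = 0.5*log(2*pi*e).
n = 2000
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)

X = sm.add_constant(x)               # design matrix [1, x]
taus = np.linspace(0.05, 0.95, 19)   # grid of quantile levels

# Fit one quantile regression per level; collect the predicted
# conditional quantiles q_tau(x_i) for every observation.
q_hat = np.column_stack([
    sm.QuantReg(y, X).fit(q=tau).predict(X) for tau in taus
])

# Enforce monotonicity across quantile levels (simple rearrangement):
# crossing quantile curves would imply negative density estimates.
q_hat = np.sort(q_hat, axis=1)

# Density between adjacent conditional quantiles:
#   f(y | x) ~= (tau_{k+1} - tau_k) / (q_{k+1}(x) - q_k(x)),
# evaluated at each observed y_i by locating it within its own
# conditional quantile grid.
log_f = np.full(n, np.nan)
for i in range(n):
    k = np.searchsorted(q_hat[i], y[i]) - 1
    if 0 <= k < len(taus) - 1:
        dq = q_hat[i, k + 1] - q_hat[i, k]
        if dq > 0:
            log_f[i] = np.log((taus[k + 1] - taus[k]) / dq)

# Plug-in conditional entropy: H(Y|X) ~= -mean(log f(y_i | x_i)).
# Theoretical value here: 0.5*log(2*pi*e) ~= 1.419.
h_cond = -np.nanmean(log_f)
print(f"Estimated H(Y|X): {h_cond:.3f}  (theoretical ~1.419)")
```

Sorting the predicted quantiles across levels is a crude guard against quantile crossing; more refined monotone rearrangement schemes exist, but any such rearrangement keeps the implied density nonnegative. The same density estimates could be combined with marginal and joint counterparts to form plug-in estimates of mutual information or transfer entropy.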