Hellinger Entropy Concept: Multidisciplinary Applications
1  Intelligent & Digital Systems, R&Di, Instituto de Soldadura e Qualidade, 4415-491 Grijó, Portugal
2  University of Florida, Gainesville, FL, USA

Abstract:

The use of a metric to assess the distance between probability densities is an important practical problem in artificial intelligence and recommendation systems. The generalized α-formalisms introduced by Rényi and Tsallis are the basis of well-known entropies and divergence models. A particular α-divergence was presented in a previous work by the co-authors; in our perspective, this divergence was already essentially defined by Hellinger. The concept of Hellinger entropy makes it possible, through a maximum-entropy syllogism, to state a bound for the Hellinger metric. The square root of the divergence is a metric, and its nonparametric estimator has information-theoretic bounds that can be computed directly from the data. Information-theoretic bounds for the Hellinger distance are developed in this work. The asymptotic behavior allows this metric to be used in competitive scenarios with three or more densities, such as clustering. The bound can be computed directly from the data, making the method suitable for streaming data.
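As a concrete illustration of the abstract's premise that the Hellinger distance can be estimated directly from data, the following minimal Python sketch computes a plug-in (histogram-based) estimate of the distance between two samples. The function names, the binning scheme, and the Gaussian test samples are illustrative assumptions; this generic estimator is not the paper's nonparametric estimator, nor does it implement the information-theoretic bounds developed in this work.

```python
import numpy as np


def hellinger_distance(p, q):
    """Hellinger distance between two discrete probability vectors.

    H(p, q) = sqrt(0.5 * sum((sqrt(p_i) - sqrt(q_i))^2)),
    a proper metric taking values in [0, 1].
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))


def empirical_hellinger(x, y, bins=32):
    """Plug-in (histogram) estimate of the Hellinger distance between the
    densities underlying two data samples x and y.

    Generic estimator for illustration only: both samples are binned on a
    common range and the binned frequencies are compared.
    """
    lo = min(np.min(x), np.min(y))
    hi = max(np.max(x), np.max(y))
    p, _ = np.histogram(x, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y, bins=bins, range=(lo, hi))
    return hellinger_distance(p / p.sum(), q / q.sum())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, 5_000)   # sample from N(0, 1)
    b = rng.normal(0.5, 1.0, 5_000)   # sample from N(0.5, 1)
    print(f"empirical Hellinger distance: {empirical_hellinger(a, b):.3f}")
```

Because such an estimate depends only on binned counts, it can in principle be updated incrementally as new observations arrive, which is the kind of property that makes data-driven bounds attractive in streaming settings.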

Keywords: Hellinger Entropy, Hellinger Metric, Maximum Entropy, Streaming Data