A Note on Bound for Jensen-Shannon Divergence by Jeffreys
Published: 03 November 2014 by MDPI in the 1st International Electronic Conference on Entropy and Its Applications, session Information Theory
Abstract: The Jensen-Shannon divergence JS(p;q) is a similarity measure between two probability distributions p and q, and it is now used across a wide range of disciplines. In this presentation, we provide a lower bound on the Jensen-Shannon divergence in terms of the Jeffreys J-divergence when p_i ≥ q_i is satisfied. In Lin's original paper, the upper bound in terms of the J-divergence was one quarter of it. Recently, a sharper bound was reported by Crooks. We discuss upper bounds given by transcendental functions of the Jeffreys divergence by comparing their values for a binary distribution.
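The quantities mentioned in the abstract can be checked numerically. Below is a minimal sketch (not from the paper) that computes the Jensen-Shannon divergence JS(p;q) = (1/2)KL(p‖m) + (1/2)KL(q‖m) with m = (p+q)/2, and the Jeffreys J-divergence J(p;q) = KL(p‖q) + KL(q‖p), then verifies Lin's classical bound JS ≤ J/4 for a sample binary distribution; the distributions p and q are arbitrary illustrative values, and natural logarithms are assumed.

```python
import math

def kl(p, q):
    # Kullback-Leibler divergence (natural log);
    # assumes q_i > 0 wherever p_i > 0
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    # Jensen-Shannon divergence via the midpoint mixture m = (p + q)/2
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jeffreys(p, q):
    # Jeffreys J-divergence: symmetrized Kullback-Leibler divergence
    return kl(p, q) + kl(q, p)

# Illustrative binary distribution (values chosen arbitrarily)
p, q = [0.8, 0.2], [0.3, 0.7]
print("JS  =", js(p, q))
print("J/4 =", jeffreys(p, q) / 4)  # Lin's bound: JS <= J/4
```

The bound JS ≤ J/4 follows from the joint convexity of the KL divergence, since KL(p‖(p+q)/2) ≤ (1/2)KL(p‖q); the sharper transcendental bounds discussed in the presentation tighten this further.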
Keywords: Jensen-Shannon divergence; variational distance; Kullback-Leibler divergence; Jeffreys divergence