A Note on Bound for Jensen-Shannon Divergence by Jeffreys
Kanagawa University

Abstract: The Jensen-Shannon divergence JS(p;q) is a similarity measure between two probability distributions p and q, and it is now used in a wide range of disciplines. In this note, we provide a lower bound on the Jensen-Shannon divergence in terms of the Jeffreys J-divergence when p_i ≥ q_i is satisfied. In Lin's original paper, the upper bound in terms of the J-divergence was one quarter of it. Recently, a sharper bound was reported by Crooks. We discuss upper bounds given by transcendental functions of the Jeffreys divergence by comparing their values for a binary distribution.
Keywords: Jensen-Shannon divergence; variational distance; Kullback-Leibler divergence; Jeffreys divergence
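The abstract relates the Jensen-Shannon divergence to bounds in terms of the Jeffreys J-divergence. The following is a minimal numerical sketch of Lin's quarter bound JS(p,q) ≤ J(p,q)/4, checked over a grid of binary distributions; it assumes the natural-log convention, and the function names are illustrative rather than taken from the paper:

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats, with 0 log 0 := 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: average KL to the midpoint m = (p + q)/2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jeffreys(p, q):
    """Jeffreys J-divergence: the symmetrized KL divergence."""
    return kl(p, q) + kl(q, p)

# Sweep binary distributions p = (a, 1-a), q = (b, 1-b) and verify
# Lin's quarter bound JS(p, q) <= J(p, q)/4 on the grid.
for i in range(1, 20):
    for j in range(1, 20):
        a, b = i / 20, j / 20
        p, q = [a, 1 - a], [b, 1 - b]
        assert js(p, q) <= jeffreys(p, q) / 4 + 1e-12
print("Lin's quarter bound JS(p,q) <= J(p,q)/4 holds on the grid")
```

The same harness can be used to compare sharper transcendental upper bounds (such as the one by Crooks mentioned above) against the quarter bound for binary distributions.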