We focus on an important property of a generalization of the Kullback-Leibler divergence used in nonextensive statistical mechanics, namely its bounds. We explicitly show upper and lower bounds on it in terms of existing familiar divergences, based on the finite range of the probability distribution ratio. This provides a link between observed distribution functions based on histograms of events and parameterized distance measures in the physical sciences. The characterizing parameter values q = 0 and q = 1 are excluded from the consideration of the bounded divergence.
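For context, the generalized Kullback-Leibler divergence referred to here is presumably the Tsallis relative entropy of nonextensive statistical mechanics; a commonly used form (our assumption, since the abstract does not write it out) for probability distributions $p = (p_i)$ and $r = (r_i)$ is

$$
D_q(p \,\|\, r) \;=\; \frac{1}{1-q}\left(1 - \sum_i p_i^{\,q}\, r_i^{\,1-q}\right), \qquad q \neq 1,
$$

which recovers the ordinary Kullback-Leibler divergence $\sum_i p_i \ln(p_i/r_i)$ in the limit $q \to 1$.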
DOI: http://dx.doi.org/10.22342/jims.19.2.165.89-97