Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors

From MaRDI portal
Publication:6443084

arXiv: 2307.04201 · MaRDI QID: Q6443084

Author name not available

Publication date: 9 July 2023

Abstract: In many applications in biology, engineering and economics, identifying similarities and differences between distributions of data from complex processes requires comparing finite categorical samples of discrete counts. Statistical divergences quantify the difference between two distributions. However, their estimation is very difficult and empirical methods often fail, especially when the samples are small. We develop a Bayesian estimator of the Kullback-Leibler divergence between two probability distributions that makes use of a mixture of Dirichlet priors on the distributions being compared. We study the properties of the estimator on two examples: probabilities drawn from Dirichlet distributions, and random strings of letters drawn from Markov chains. We extend the approach to the squared Hellinger divergence. Both estimators outperform other estimation techniques, with better results for data with a large number of categories and for higher values of divergences.
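The core idea described in the abstract, placing a Dirichlet prior on each unknown categorical distribution and estimating the divergence under the resulting posteriors, can be illustrated with a simplified Monte Carlo sketch. Note the simplification: the paper uses a *mixture* of Dirichlet priors, while the sketch below uses a single symmetric Dirichlet prior with concentration `alpha`; the function name and parameters are illustrative, not the authors' API (their implementation is in the `catede` repository linked below).

```python
import numpy as np

def bayes_kl_estimate(counts_p, counts_q, alpha=1.0, n_samples=2000, seed=0):
    """Posterior-mean Monte Carlo estimate of D_KL(p || q) from count data.

    Simplified sketch: a single symmetric Dirichlet(alpha) prior on each
    distribution (the paper uses a mixture of Dirichlet priors instead).
    """
    rng = np.random.default_rng(seed)
    counts_p = np.asarray(counts_p, dtype=float)
    counts_q = np.asarray(counts_q, dtype=float)
    # Dirichlet-multinomial conjugacy: the posterior over each distribution
    # is Dirichlet with concentration = prior concentration + observed counts.
    p_samples = rng.dirichlet(counts_p + alpha, size=n_samples)
    q_samples = rng.dirichlet(counts_q + alpha, size=n_samples)
    # KL divergence of each posterior sample pair, then average over samples.
    kl = np.sum(p_samples * (np.log(p_samples) - np.log(q_samples)), axis=1)
    return kl.mean()

# Illustrative usage on two small categorical samples.
est_same = bayes_kl_estimate([50, 50, 50], [50, 50, 50])
est_diff = bayes_kl_estimate([100, 5, 5], [5, 5, 100])
```

Because the prior keeps all category probabilities strictly positive, the estimator avoids the divergent log-ratios that break the empirical plug-in estimate when a category is observed in one sample but not the other, which is why such Bayesian estimators behave better for small samples and many categories.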




Has companion code repository: https://github.com/statbiophys/catede
This page was built for publication: Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors
