Estimating Rényi's α-Cross-Entropies in a Matrix-Based Way
From MaRDI portal
Publication: 6378454
arXiv: 2109.11737 · MaRDI QID: Q6378454
Author name not available
Publication date: 24 September 2021
Abstract: Conventional information-theoretic quantities assume access to probability distributions. Estimating such distributions is not trivial. Here, we consider function-based formulations of cross-entropy that sidestep this a priori estimation requirement. We propose three measures of Rényi's α-cross-entropies in the setting of reproducing-kernel Hilbert spaces. Each measure has its appeals. We prove that we can estimate these measures in an unbiased, non-parametric, and minimax-optimal way. We do this via sample-constructed Gram matrices. This yields matrix-based estimators of Rényi's α-cross-entropies. These estimators satisfy all of the axioms that Rényi established for divergences. Our cross-entropies can thus be used for assessing distributional differences. They are also appropriate for handling high-dimensional distributions, since the convergence rate of our estimator is independent of the sample dimensionality. Python code for implementing these measures can be found at https://github.com/isledge/MBRCE.
Has companion code repository: https://github.com/isledge/mbrce
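The abstract's central device — a sample-constructed Gram matrix whose spectrum defines an information quantity — can be sketched as follows. This is a minimal illustration of the general matrix-based Rényi α-entropy construction that such estimators build on, not the paper's specific cross-entropy measures; the Gaussian kernel choice, the bandwidth `sigma`, and the function names are assumptions for the sketch.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Pairwise Gaussian-kernel Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2)).

    Kernel and bandwidth are illustrative choices, not prescribed by the paper.
    """
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)  # squared pairwise distances
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def matrix_renyi_entropy(K, alpha=2.0):
    """Matrix-based Renyi alpha-entropy of a Gram matrix K.

    Normalizes A = K / tr(K) so its eigenvalues sum to 1, then evaluates
    S_alpha(A) = (1 / (1 - alpha)) * log2( sum_i lambda_i(A)^alpha ),
    i.e. a Renyi entropy over the spectrum in place of a probability mass function.
    """
    A = K / np.trace(K)
    eigs = np.clip(np.linalg.eigvalsh(A), 0.0, None)  # guard tiny negative round-off
    return np.log2(np.sum(eigs ** alpha)) / (1.0 - alpha)
```

For n mutually well-separated samples the Gram matrix is near the identity and the entropy approaches log2(n), while n identical samples give a rank-one Gram matrix and entropy near 0 — no density estimate is ever formed, which is the point of the function-based formulation.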
This page was built for publication: Estimating Rényi's α-Cross-Entropies in a Matrix-Based Way