MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution
From MaRDI portal
Publication: 6342227
arXiv: 2006.03722
MaRDI QID: Q6342227
Author name not available
Publication date: 5 June 2020
Abstract: This paper proposes a new family of lower and upper bounds on the minimum mean squared error (MMSE). The key idea is to minimize/maximize the MMSE subject to the constraint that the joint input-output distribution lies in a Kullback-Leibler divergence ball centered at some Gaussian reference distribution. Both bounds are tight and are attained by Gaussian distributions whose mean is identical to that of the reference distribution and whose covariance matrix is determined by a scalar parameter that can be obtained by finding the root of a monotonic function. The upper bound corresponds to a minimax optimal estimator and provides performance guarantees under distributional uncertainty. The lower bound provides an alternative to well-known inequalities in estimation theory, such as the Cramér-Rao bound, that is potentially tighter and defined for a larger class of distributions. Examples of applications in signal processing and information theory illustrate the usefulness of the proposed bounds in practice.
Has companion code repository: https://github.com/mifauss/KL-Divergence-MMSE-Bounds
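The abstract's computational recipe, reducing each bound to a one-dimensional root search over a monotonic function, can be sketched generically. The function `g` below is a hypothetical stand-in (the KL divergence between two zero-mean Gaussian variances minus the ball radius), not the paper's actual optimality condition; the Gaussian MMSE formula and bisection routine are standard. See the companion repository above for the authors' implementation.

```python
import math

def mmse_gaussian(var_x, var_y, cov_xy):
    # MMSE of estimating X from Y when (X, Y) is jointly Gaussian:
    # E[(X - E[X|Y])^2] = var_x - cov_xy^2 / var_y
    return var_x - cov_xy**2 / var_y

def find_root_bisection(g, lo, hi, tol=1e-10):
    # Bisection for the root of a monotonically increasing function g on [lo, hi].
    # The paper reduces each bound to such a scalar root search.
    assert g(lo) <= 0.0 <= g(hi), "root must be bracketed"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) <= 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical stand-in for the paper's monotonic function: its root ties the
# KL radius eps to the variance scaling t of an extremal Gaussian.
eps = 0.1  # KL divergence ball radius (illustrative value)
# KL(N(0, t) || N(0, 1)) - eps, increasing for t >= 1
g = lambda t: 0.5 * (t - 1.0 - math.log(t)) - eps

t_star = find_root_bisection(g, 1.0, 10.0)
```

With the scaling parameter in hand, the extremal covariance is a scaled version of the reference covariance, so the bound itself is evaluated by plugging the scaled variances into the closed-form Gaussian MMSE.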
This page was built for publication: MMSE Bounds Under Kullback-Leibler Divergence Constraints on the Joint Input-Output Distribution