Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood


Publication:1737972

DOI: 10.4171/MSL/1-1-1
zbMATH Open: 1416.62193
arXiv: 1701.05009
OpenAlex: W3121931924
Wikidata: Q129921099 (Scholia: Q129921099)
MaRDI QID: Q1737972

Author name not available

Publication date: 24 April 2019


Abstract: We study the maximum likelihood estimator of the density of $n$ independent observations, under the assumption that it is well approximated by a mixture with a large number of components. The main focus is on statistical properties with respect to the Kullback-Leibler loss. We establish risk bounds taking the form of sharp oracle inequalities both in deviation and in expectation. A simple consequence of these bounds is that the maximum likelihood estimator attains the optimal rate $((\log K)/n)^{1/2}$, up to a possible logarithmic correction, in the problem of convex aggregation when the number $K$ of components is larger than $n^{1/2}$. More importantly, under the additional assumption that the Gram matrix of the components satisfies the compatibility condition, the obtained oracle inequalities yield the optimal rate in the sparsity scenario. That is, if the weight vector is (nearly) $D$-sparse, we get the rate $(D\log K)/n$. As a natural complement to our oracle inequalities, we introduce the notion of nearly-$D$-sparse aggregation and establish matching lower bounds for this type of aggregation.
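
The estimator discussed in the abstract maximizes the likelihood over the simplex of mixture weights, with the $K$ component densities held fixed. The following is a minimal, hypothetical NumPy sketch of that optimization using standard EM-style multiplicative updates for the weights; the function name, the toy Gaussian components, and all parameter choices are illustrative assumptions, not taken from the paper.

    import numpy as np

    def mle_mixture_weights(F, n_iter=500, tol=1e-10):
        """Maximum-likelihood mixture weights for fixed component densities.

        F : (n, K) array with F[i, k] = f_k(X_i), the k-th component density
            evaluated at the i-th observation.
        Returns a weight vector on the probability simplex (EM-style updates;
        the log-likelihood is concave in the weights, so EM converges to the MLE).
        """
        n, K = F.shape
        theta = np.full(K, 1.0 / K)        # start from uniform weights
        prev_ll = -np.inf
        for _ in range(n_iter):
            mix = F @ theta                # mixture density at each observation
            ll = np.mean(np.log(mix))      # normalized log-likelihood
            if ll - prev_ll < tol:
                break
            prev_ll = ll
            # EM update: average responsibilities of each component over the sample
            theta = theta * (F / mix[:, None]).mean(axis=0)
            theta /= theta.sum()           # guard against round-off drift
        return theta

    # Toy usage (hypothetical): K Gaussian location components, data drawn
    # from a 2-sparse mixture, so the fitted weights should concentrate on
    # two components.
    rng = np.random.default_rng(0)
    n, K = 1000, 50
    centers = np.linspace(-5, 5, K)
    X = np.concatenate([rng.normal(centers[10], 1.0, n // 2),
                        rng.normal(centers[40], 1.0, n - n // 2)])
    F = np.exp(-0.5 * (X[:, None] - centers[None, :]) ** 2) / np.sqrt(2 * np.pi)
    theta_hat = mle_mixture_weights(F)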


Full work available at URL: https://arxiv.org/abs/1701.05009



This page was built for publication: Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
