Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM
Publication: 2233582
DOI: 10.1214/21-EJS1905
zbMath: 1471.62273
arXiv: 2101.00575
OpenAlex: W3204209514
MaRDI QID: Q2233582
Publication date: 11 October 2021
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/2101.00575
Cites Work
- Statistical guarantees for the EM algorithm: from population to sample-based analysis
- A spectral algorithm for learning mixture models
- On the convergence properties of the EM algorithm
- Learning mixtures of separated nonspherical Gaussians
- Statistical convergence of the EM algorithm on Gaussian mixture models
- Efficiently learning mixtures of two Gaussians
- Tight Bounds for Learning a Mixture of Two Gaussians
- Learning mixtures of spherical Gaussians
- The Spectral Method for General Mixture Models
- High-Dimensional Probability
- Learning Theory