


The EM Algorithm gives Sample-Optimality for Learning Mixtures of Well-Separated Gaussians

From MaRDI portal
Publication:6333957

arXiv: 2002.00329 · MaRDI QID: Q6333957

Author name not available

Publication date: 2 February 2020

Abstract: We consider the problem of spherical Gaussian Mixture models with $k \geq 3$ components when the components are well separated. A fundamental previous result established that separation of $\Omega(\sqrt{\log k})$ is necessary and sufficient for identifiability of the parameters with polynomial sample complexity (Regev and Vijayaraghavan, 2017). In the same context, we show that $\tilde{O}(kd/\epsilon^2)$ samples suffice for any $\epsilon \lesssim 1/k$, closing the gap from polynomial to linear, and thus giving the first optimal sample upper bound for the parameter estimation of well-separated Gaussian mixtures. We accomplish this by proving a new result for the Expectation-Maximization (EM) algorithm: we show that EM converges locally under separation $\Omega(\sqrt{\log k})$. The previous best-known guarantee required $\Omega(\sqrt{k})$ separation (Yan et al., 2017). Unlike prior work, our results do not assume or use prior knowledge of the (potentially different) mixing weights or variances of the Gaussian components. Furthermore, our results show that the finite-sample error of EM does not depend on non-universal quantities such as pairwise distances between the means of the Gaussian components.
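For context, the EM iteration the abstract refers to alternates between computing posterior "responsibilities" (E-step) and re-estimating component parameters (M-step). The sketch below is an illustrative implementation for a spherical Gaussian mixture with unknown means, mixing weights, and per-component variances; it is not code from the paper, and all function and variable names are hypothetical.

```python
# Illustrative sketch (not from the paper): one EM iteration for a
# k-component spherical Gaussian mixture with unknown means, mixing
# weights, and per-component variances.
import numpy as np

def em_step(X, means, weights, variances):
    """One EM iteration. X: (n, d) data; means: (k, d);
    weights: (k,); variances: (k,) spherical variances."""
    n, d = X.shape
    # E-step: posterior responsibility of each component for each point,
    # computed in log-space for numerical stability.
    sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)  # (n, k)
    log_p = (np.log(weights)[None, :]
             - 0.5 * d * np.log(2 * np.pi * variances)[None, :]
             - sq / (2 * variances)[None, :])
    log_p -= log_p.max(axis=1, keepdims=True)        # stabilize exp
    resp = np.exp(log_p)
    resp /= resp.sum(axis=1, keepdims=True)          # (n, k), rows sum to 1
    # M-step: re-estimate weights, means, and spherical variances
    # from the responsibilities.
    nk = resp.sum(axis=0)                            # effective counts, (k,)
    new_weights = nk / n
    new_means = (resp.T @ X) / nk[:, None]
    sq_new = ((X[:, None, :] - new_means[None, :, :]) ** 2).sum(axis=2)
    new_variances = (resp * sq_new).sum(axis=0) / (d * nk)
    return new_means, new_weights, new_variances
```

Note that, in line with the paper's setting, this sketch does not assume the mixing weights or variances are known or equal; each is re-estimated per component at every M-step. The paper's local-convergence result concerns iterating such a step from an initialization close to the true means under $\Omega(\sqrt{\log k})$ separation.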

