Learning mixtures of spherical Gaussians
Publication: 2986854
DOI: 10.1145/2422436.2422439
zbMath: 1362.68246
arXiv: 1206.5766
OpenAlex: W2014565165
MaRDI QID: Q2986854
Publication date: 16 May 2017
Published in: Proceedings of the 4th conference on Innovations in Theoretical Computer Science
Full work available at URL: https://arxiv.org/abs/1206.5766
MSC classification: Estimation in multivariate analysis (62H12); Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- GAT–GMM: Generative Adversarial Training for Gaussian Mixture Models
- Low Permutation-rank Matrices: Structural Properties and Noisy Completion
- Tensor decomposition for learning Gaussian mixtures from moments
- Learning diagonal Gaussian mixture models and incomplete tensor decompositions
- A new method of moments for latent variable models
- Noisy tensor completion via the sum-of-squares hierarchy
- On the optimization landscape of tensor decompositions
- Overcomplete Order-3 Tensor Decomposition, Blind Deconvolution, and Gaussian Mixture Models
- Compressive statistical learning with random feature moments
- Moment Estimation for Nonparametric Mixture Models through Implicit Tensor Decomposition
- On Best Low Rank Approximation of Positive Definite Tensors
- A tensor-EM method for large-scale latent class analysis with binary responses
- Mixed membership Gaussians
- Convergence rates of latent topic models under relaxed identifiability conditions
- A Doubly Enhanced EM Algorithm for Model-Based Tensor Clustering
- Recovering Structured Probability Matrices
- Barriers for Rank Methods in Arithmetic Complexity
- Robust Estimators in High-Dimensions Without the Computational Intractability
- Estimating Higher-Order Moments Using Symmetric Tensor Decomposition
- Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM
- Eigenvectors of Orthogonally Decomposable Functions
- Robust high-dimensional factor models with applications to statistical machine learning
- Moment identifiability of homoscedastic Gaussian mixtures
- Statistical convergence of the EM algorithm on Gaussian mixture models
- A Model-Based Embedding Technique for Segmenting Customers
- Tensor Decompositions for Learning Latent Variable Models (A Survey for ALT)
- Statistical and Computational Guarantees for the Baum-Welch Algorithm
- Structured matrix estimation and completion
- Polynomial Learning of Distribution Families
- Fast Moment Estimation for Generalized Latent Dirichlet Models
- A spectral algorithm for latent Dirichlet allocation
- Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
Cites Work
- Teachability in computational learning
- Measuring teachability using variants of the teaching dimension
- Teaching a smarter learner
- Occam's razor
- Pseudorandom generators for space-bounded computation
- On the power of inductive inference from good examples
- A model of interactive teaching
- Learning from different teachers
- On the limits of efficient teachability
- In search of an easy witness: Exponential time vs. probabilistic polynomial time
- On the complexity of teaching
- On specifying Boolean functions by labelled examples
- Recent Developments in Algorithmic Teaching
- A theory of the learnable
- Teaching Randomized Learners
- Algorithmic Learning Theory
- A theory of goal-oriented communication
- Derandomizing polynomial identity tests means proving circuit lower bounds