Provable ICA with unknown Gaussian noise, and implications for Gaussian mixtures and autoencoders
DOI: 10.1007/s00453-015-9972-2
zbMath: 1333.68224
arXiv: 1206.5349
OpenAlex: W2139158168
MaRDI QID: Q2345948
Sanjeev Arora, Rong Ge, Ankur Moitra, Sushant Sachdeva
Publication date: 21 May 2015
Published in: Algorithmica
Full work available at URL: https://arxiv.org/abs/1206.5349
Mathematics Subject Classification:
Factor analysis and principal components; correspondence analysis (62H25)
Database theory (68P15)
Learning and adaptive systems in artificial intelligence (68T05)
Related Items (6)
Statistical Methods for Minimax Estimation in Linear Models with Unknown Design Over Finite Alphabets
Free component analysis: theory, algorithms and applications
Robust Estimators in High-Dimensions Without the Computational Intractability
Eigenvectors of Orthogonally Decomposable Functions
Multiscale blind source separation
A spectral algorithm for latent Dirichlet allocation
Cites Work
- Projection pursuit
- Independent component analysis, a new concept?
- Blind source separation via the second characteristic function
- Learning mixtures of separated nonspherical Gaussians
- Robust blind source separation algorithms using cumulants
- Efficiently learning mixtures of two Gaussians
- Learning mixtures of spherical Gaussians
- Reducing the Dimensionality of Data with Neural Networks
- Learning Deep Architectures for AI
- Fourth-Order Cumulant-Based Blind Identification of Underdetermined Mixtures
- Smoothed analysis of tensor decompositions
- Polynomial Learning of Distribution Families