Stochastic Gauss-Newton algorithms for online PCA
DOI: 10.1007/s10915-023-02289-0
zbMath: 1520.65021
arXiv: 2203.13081
MaRDI QID: Q6111665
Publication date: 4 August 2023
Published in: Journal of Scientific Computing
Full work available at URL: https://arxiv.org/abs/2203.13081
MSC classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Numerical computation of eigenvalues and eigenvectors of matrices (65F15)
- Probabilistic models, generic numerical methods in probability and statistics (65C20)
- Eigenvalues, singular values, and eigenvectors (15A18)
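For context, online PCA maintains an estimate of the top-k principal subspace and updates it one sample (or mini-batch) at a time. The sketch below illustrates this setting with Oja's classical iteration, one of the baselines appearing in the cited works; it is not the stochastic Gauss-Newton method of the publication itself, and the function name, step size, and synthetic data are illustrative assumptions.

```python
import numpy as np


def oja_online_pca(samples, k, eta=0.01, seed=0):
    """Estimate the top-k principal subspace of streaming samples with Oja's rule.

    Illustrative online PCA baseline; NOT the stochastic Gauss-Newton
    algorithm of the publication above.
    """
    rng = np.random.default_rng(seed)
    d = samples.shape[1]
    # random orthonormal initial basis
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for x in samples:
        # rank-one stochastic update followed by re-orthonormalization
        W = W + eta * np.outer(x, x @ W)
        W, _ = np.linalg.qr(W)
    return W


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # synthetic stream with a dominant 2-dimensional subspace spanned by U
    U, _ = np.linalg.qr(rng.standard_normal((20, 2)))
    signal = rng.standard_normal((5000, 2)) @ np.diag([5.0, 3.0]) @ U.T
    X = signal + 0.1 * rng.standard_normal((5000, 20))
    W = oja_online_pca(X, k=2, eta=0.005)
    # Frobenius norm of U^T W is close to sqrt(2) when the subspaces align
    print("subspace alignment:", np.linalg.norm(U.T @ W))
```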
Cites Work
- Near-optimal stochastic approximation for online principal component estimation
- Convergence analysis of Oja's iteration for solving online PCA with nonzero-mean samples
- Global convergence of Oja's PCA learning algorithm with a non-zero-approaching adaptive learning rate
- A simplified neuron model as a principal component analyzer
- A Davidson program for finding a few selected extreme eigenpairs of a large, sparse, real, symmetric matrix
- Principal component analysis.
- Semigroups of stochastic gradient descent and online principal component analysis: properties and diffusion approximations
- Efficiency of minimizing compositions of convex functions and smooth maps
- Trace-penalty minimization for large-scale eigenspace computation
- Minimax sparse principal subspace estimation in high dimensions
- Simultaneous iteration method for symmetric matrices
- Toward the Optimal Preconditioned Eigensolver: Locally Optimal Block Preconditioned Conjugate Gradient Method
- Limited Memory Block Krylov Subspace Optimization for Computing Dominant Singular Value Decompositions
- Principal component analysis: a review and recent developments
- On the Rates of Convergence of the Lanczos and the Block-Lanczos Methods
- A Jacobi--Davidson Iteration Method for Linear Eigenvalue Problems
- GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery
- Parallelizable Algorithms for Optimization Problems with Orthogonality Constraints
- Accelerating Convergence by Augmented Rayleigh--Ritz Projections For Large-Scale Eigenpair Computation
- An Introduction to Matrix Concentration Inequalities
- An Efficient Gauss--Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations
- The method of stochastic approximation for the determination of the least eigenvalue of a symmetrical matrix
- A Stochastic Approximation Method
- Online Principal Component Analysis in High Dimension: Which Algorithm to Choose?