Riemannian gradient methods for stochastic composition problems
Publication: 6488717
DOI: 10.1016/J.NEUNET.2022.06.004
Wikidata: Q115342702 · Scholia: Q115342702
MaRDI QID: Q6488717
Publication date: 17 October 2023
Published in: Neural Networks
Keywords: principal component analysis, Riemannian manifold, Stiefel manifold, Grassmann manifold, deep neural networks, composition optimization
Mathematics Subject Classification: Artificial neural networks and deep learning (68T07); Statistics (62-XX); Differential geometry (53-XX)
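To illustrate the topic named in the title and keywords (this is a generic sketch, not the algorithm proposed in the cited paper): a Riemannian gradient method on the Stiefel manifold St(n, p) for the PCA objective f(X) = tr(Xᵀ A X), using tangent-space projection of the Euclidean gradient and a thin-QR retraction. All function names here are illustrative.

```python
import numpy as np

def sym(M):
    # Symmetric part of a square matrix.
    return 0.5 * (M + M.T)

def riemannian_grad(X, G):
    # Project the Euclidean gradient G onto the tangent space of the
    # Stiefel manifold at X (canonical embedded-metric projection).
    return G - X @ sym(X.T @ G)

def qr_retract(X, V):
    # Retract X + V back onto the Stiefel manifold via thin QR,
    # fixing signs of R's diagonal so the retraction is well defined.
    Q, R = np.linalg.qr(X + V)
    s = np.sign(np.diag(R))
    s[s == 0] = 1.0
    return Q * s

def pca_riemannian_ascent(A, p, steps=500, lr=0.05, seed=0):
    # Gradient *ascent* on f(X) = tr(X^T A X), whose maximizer spans
    # the top-p eigenspace of the symmetric matrix A.
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    X, _ = np.linalg.qr(rng.standard_normal((n, p)))
    for _ in range(steps):
        G = 2.0 * A @ X             # Euclidean gradient of f
        xi = riemannian_grad(X, G)  # Riemannian gradient
        X = qr_retract(X, lr * xi)  # step along xi, then retract
    return X
```

Stochastic and composition variants replace the exact gradient `G` with sampled or nested estimators while keeping the same project-then-retract structure.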
Cites Work
- Lie-group-type neural system learning by manifold retractions
- Distributionally robust optimization. A review on theory and applications
- Low-Rank Matrix Completion by Riemannian Optimization
- Manopt, a Matlab toolbox for optimization on manifolds
- A Broyden Class of Quasi-Newton Methods for Riemannian Optimization
- Complete Dictionary Recovery Over the Sphere II: Recovery by Riemannian Trust-Region Method
- Empirical Arithmetic Averaging Over the Compact Stiefel Manifold
- Accelerating Stochastic Composition Optimization
- Gradient-based Learning Methods Extended to Smooth Manifolds Applied to Automated Clustering
- Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport
- Solving Stochastic Compositional Optimization is Nearly as Easy as Solving Stochastic Optimization