Riemannian Natural Gradient Methods
From MaRDI portal
Publication:6189169
DOI: 10.1137/22m1509643
arXiv: 2207.07287
MaRDI QID: Q6189169
ZaiWen Wen, Minghan Yang, Jiang Hu, Unnamed Author, Anthony Man-Cho So
Publication date: 8 February 2024
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/2207.07287
Keywords: manifold optimization; Kronecker-factored approximation; natural gradient method; Riemannian Fisher information matrix
MSC classifications: Semidefinite programming (90C22); Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- A Stochastic Quasi-Newton Method for Large-Scale Optimization
- Stochastic optimization using a trust-region method and random models
- Sub-sampled Newton methods
- A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
- Sketch-based empirical natural gradient methods for deep learning
- A brief introduction to manifold optimization
- A Riemannian gossip approach to subspace learning on Grassmann manifold
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Riemannian stochastic variance-reduced cubic regularized Newton method for submanifold optimization
- Projection-like Retractions on Matrix Manifolds
- Optimization Methods on Riemannian Manifolds and Their Application to Shape Space
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- Fast Curvature Matrix-Vector Products for Second-Order Gradient Descent
- Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
- An Introduction to Optimization on Smooth Manifolds
- Stochastic Trust-Region Methods with Trust-Region Radius Depending on Probabilistic Models
- Riemannian Optimization and Its Applications
- Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport
- Stochastic Gradient Descent on Riemannian Manifolds
- Covariance, subspace, and intrinsic Cramér-Rao bounds
- Information-Geometric Optimization Algorithms: A Unifying Picture via Invariance Principles
- An Extrinsic Look at the Riemannian Hessian
- Differentiation Under the Integral Sign
- A Stochastic Approximation Method
- Efficient Natural Gradient Descent Methods for Large-Scale PDE-Based Optimization Problems