Operator-valued formulas for Riemannian gradient and Hessian and families of tractable metrics in Riemannian optimization
DOI: 10.1007/s10957-023-02242-z · zbMath: 1515.65169 · arXiv: 2009.10159 · MaRDI QID: Q6108977
Publication date: 26 July 2023
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://arxiv.org/abs/2009.10159
Keywords: optimization · flag manifold · positive-definite · machine learning · Riemannian Hessian · Stiefel · positive-semidefinite
MSC classification:
- Numerical optimization and variational techniques (65K10)
- Learning and adaptive systems in artificial intelligence (68T05)
- Special Riemannian manifolds (Einstein, Sasakian, etc.) (53C25)
- Real-valued functions on manifolds (58C05)
- Sensitivity analysis for optimization problems on manifolds (49Q12)
- Relations of manifolds and cell complexes with engineering (57Z20)
- Relations of manifolds and cell complexes with computer and data science (57Z25)
Cites Work
- Minimizing a differentiable function over a differential manifold
- A Lagrangian approach to extremal curves on Stiefel manifolds
- Optimization on flag manifolds
- A survey and comparison of contemporary algorithms for computing the matrix geometric mean
- A Riemannian framework for tensor computing
- Closed-form geodesics and optimization for Riemannian logarithms of Stiefel and flag manifolds
- Riemannian Preconditioning
- Pymanopt: A Python Toolbox for Optimization on Manifolds using Automatic Differentiation
- Manopt, a Matlab toolbox for optimization on manifolds
- Low-Rank Optimization on the Cone of Positive Semidefinite Matrices
- Newton's method on Riemannian manifolds and a geometric model for the human spine
- Riemannian Metric and Geometric Mean for Positive Semidefinite Matrices of Fixed Rank
- The Geometry of Algorithms with Orthogonality Constraints
- How and Why to Solve the Operator Equation AX − XB = Y
- IV.—On Least Squares and Linear Combination of Observations
- A Riemannian geometry with complete geodesics for the set of positive semidefinite matrices of fixed rank
- Conic Geometric Optimization on the Manifold of Positive Definite Matrices
- Optimization algorithms exploiting unitary constraints
- Covariance, subspace, and intrinsic Cramér-Rao bounds
- Algorithm 432: Solution of the matrix equation AX + XB = C [F4]
- An Extrinsic Look at the Riemannian Hessian
- Independent Component Analysis and Blind Signal Separation