An adaptive regularized proximal Newton-type methods for composite optimization over the Stiefel manifold
Publication: 6624434
DOI: 10.1007/s10589-024-00595-3
MaRDI QID: Q6624434
Publication date: 25 October 2024
Published in: Computational Optimization and Applications
Keywords: Stiefel manifold; linear convergence; superlinear convergence; regularized quasi-Newton method; proximal Newton-type method
MSC classification: Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Methods of quasi-Newton type (90C53)
Cites Work
- A feasible method for optimization with orthogonality constraints
- ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds
- A splitting method for orthogonality constrained problems
- Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds
- A nonmonotone conjugate gradient algorithm for unconstrained optimization
- Subgradient algorithm on Riemannian manifolds
- A regularized semi-smooth Newton method with projection steps for composite convex programs
- Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
- Riemannian proximal gradient methods
- A Riemannian subgradient algorithm for economic dispatch with valve-point effect
- A collection of nonsmooth Riemannian optimization problems
- Quadratic optimization with orthogonality constraint: explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods
- Cubic regularization of Newton method and its global performance
- An inexact Riemannian proximal gradient method
- Projection-like Retractions on Matrix Manifolds
- Proximal Newton-Type Methods for Minimizing Composite Functions
- An Augmented Lagrangian Method for Non-Lipschitz Nonconvex Programming
- Adaptive Quadratically Regularized Newton Method for Riemannian Optimization
- Algorithms for nonlinear constraints that use lagrangian functions
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- A Nonmonotone Line Search Technique for Newton’s Method
- A Proximal Quasi-Newton Trust-Region Method for Nonsmooth Regularized Optimization
- Compressed modes for variational problems in mathematics and physics
- Proximal Gradient Method for Nonsmooth Optimization over the Stiefel Manifold
- Parallelizable Algorithms for Optimization Problems with Orthogonality Constraints
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- Stochastic Quasi-Newton Methods for Nonconvex Stochastic Optimization
- A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds
- Proximal quasi-Newton method for composite optimization over the Stiefel manifold
- A Riemannian Proximal Newton Method