Efficiency of higher-order algorithms for minimizing composite functions
From MaRDI portal
Publication: 6155068
DOI: 10.1007/s10589-023-00533-9
arXiv: 2203.13367
MaRDI QID: Q6155068
Publication date: 16 February 2024
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/2203.13367
Keywords: convergence rates, higher-order methods, composite optimization, Kurdyka–Łojasiewicz property, (non)convex minimization
Cites Work
- Smooth minimization of non-smooth functions
- Gradient methods for minimizing composite functions
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- The value function approach to convergence analysis in composite optimization
- On the use of third-order models with fourth-order regularization for unconstrained optimization
- The multiproximal linearization method for convex composite problems
- Implementable tensor methods in unconstrained convex optimization
- Efficiency of minimizing compositions of convex functions and smooth maps
- Linear convergence of first order methods for non-strongly convex optimization
- Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
- On the implementation of an interior-point filter line-search algorithm for large-scale nonlinear programming
- Cubic regularization of Newton method and its global performance
- Clarke Subgradients of Stratifiable Functions
- Majorizing Functions and Convergence of the Gauss–Newton Method for Convex Composite Optimization
- Conditions for convergence of trust region algorithms for nonsmooth optimization
- Testing Unconstrained Optimization Software
- A model algorithm for composite nondifferentiable optimization problems
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- Inexact basic tensor methods for some classes of convex optimization problems
- A concise second-order complexity analysis for unconstrained optimization using high-order regularized models
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- High-Order Optimization Methods for Fully Composite Problems