High-Order Optimization Methods for Fully Composite Problems
From MaRDI portal
Publication: 5869820
DOI: 10.1137/21M1410063
MaRDI QID: Q5869820
Nikita Doikov, Yu. E. Nesterov
Publication date: 29 September 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2103.12632
Keywords: nonsmooth optimization; convex optimization; constrained optimization; gradient methods; high-order methods; accelerated algorithms
MSC classification: Numerical mathematical programming methods (65K05); Convex programming (90C25); Nonlinear programming (90C30); Newton-type methods (49M15)
Related Items
- Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
- Efficiency of higher-order algorithms for minimizing composite functions
Cites Work
- A proximal method for composite minimization
- Gradient methods for minimizing composite functions
- Optimality conditions for non-finite valued convex composite functions
- Lectures on convex optimization
- Accelerating the cubic regularization of Newton's method on convex problems
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Complexity bounds for primal-dual methods minimizing the model of objective function
- A Gauss-Newton method for convex composite optimization
- Minimizing uniformly convex functions by cubic regularization of Newton method
- Smoothness parameter of power of Euclidean norm
- The multiproximal linearization method for convex composite problems
- Implementable tensor methods in unconstrained convex optimization
- Oracle complexity of second-order methods for smooth convex optimization
- Efficiency of minimizing compositions of convex functions and smooth maps
- New constraint qualification and conjugate duality for composed convex optimization problems
- Cubic regularization of Newton method and its global performance
- Farkas-type results for inequality systems with composed convex functions via conjugate duality
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- Epi-convergent Smoothing with Applications to Convex Composite Functions
- A Moving Balls Approximation Method for a Class of Smooth Constrained Minimization Problems
- On the Evaluation Complexity of Composite Function Minimization with Applications to Nonconvex Nonlinear Programming
- Strong Metric (Sub)regularity of Karush–Kuhn–Tucker Mappings for Piecewise Linear-Quadratic Convex-Composite Optimization and the Quadratic Convergence of Newton’s Method
- A new constraint qualification for the formula of the subdifferential of composed convex functions in infinite dimensional spaces
- Generalized Moreau–Rockafellar results for composed convex functions
- Descent methods for composite nondifferentiable optimization problems
- Second order necessary and sufficient conditions for convex composite NDO
- New Proximal Point Algorithms for Convex Minimization
- Algorithms for nonlinear constraints that use lagrangian functions
- Composite Difference-Max Programs for Modern Statistical Estimation Problems
- Tensor Methods for Minimizing Convex Functions with Hölder Continuous Higher-Order Derivatives
- A Study of Convex Convex-Composite Functions via Infimal Convolution with Applications
- Contracting Proximal Methods for Smooth Convex Optimization
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Modified Gauss–Newton scheme with worst case guarantees for global performance
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians