On the complexity of solving feasibility problems with regularized models
DOI: 10.1080/10556788.2020.1786564
zbMath: 1501.90092
OpenAlex: W3043378690
MaRDI QID: Q5038424
Ernesto G. Birgin, Luís Felipe Bueno, José Mario Martínez
Publication date: 30 September 2022
Published in: Optimization Methods and Software
Full work available at URL: https://doi.org/10.1080/10556788.2020.1786564
Mathematics Subject Classification:
- Analysis of algorithms and problem complexity (68Q25)
- Numerical mathematical programming methods (65K05)
- Abstract computational complexity for mathematical programming problems (90C60)
- Nonlinear programming (90C30)
- Numerical methods based on nonlinear programming (49M37)
Uses Software
Cites Work
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- CUTEst: a constrained and unconstrained testing environment with safe threads for mathematical optimization
- Cubic regularization of Newton method and its global performance
- On the Evaluation Complexity of Cubic Regularization Methods for Potentially Rank-Deficient Nonlinear Least-Squares Problems and Its Relevance to Constrained Nonlinear Optimization
- A Linearly Convergent Algorithm for Solving a Class of Nonconvex/Affine Feasibility Problems
- Douglas–Rachford feasibility methods for matrix completion problems
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- On Augmented Lagrangian Methods with General Lower-Level Constraints
- Two-Point Step Size Gradient Methods
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- Accelerated Regularized Newton Methods for Minimizing Composite Convex Functions
- Practical Augmented Lagrangian Methods for Constrained Optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- Complexity and performance of an Augmented Lagrangian algorithm