On large-scale unconstrained optimization and arbitrary regularization
DOI: 10.1007/s10589-021-00322-2
zbMath: 1484.90114
OpenAlex: W3210593945
MaRDI QID: Q2070329
Authors: L. T. Santos, José Mario Martínez
Publication date: 24 January 2022
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-021-00322-2
Cites Work
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization
- On the use of the energy norm in trust-region and adaptive cubic regularization subproblems
- A line-search algorithm inspired by the adaptive cubic regularization framework and complexity analysis
- On the use of third-order models with fourth-order regularization for unconstrained optimization
- A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization
- Cubic regularization of Newton method and its global performance
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- Testing Unconstrained Optimization Software
- Inexact spectral projected gradient methods on convex sets
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- On High-order Model Regularization for Constrained Optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- On Regularization and Active-set Methods with Complexity for Constrained Optimization
- Optimization Methods for Large-Scale Machine Learning
- A Nonmonotone Line Search Technique for Newton’s Method
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- A spectral conjugate gradient method for unconstrained optimization