Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems
DOI: 10.1007/s10589-019-00064-2
zbMath: 1435.90100
OpenAlex: W2779750444
MaRDI QID: Q2419539
Nicholas I. M. Gould, Jennifer Scott, Tyrone Rees
Publication date: 13 June 2019
Published in: Computational Optimization and Applications
Full work available at URL: http://centaur.reading.ac.uk/82226/1/theory_v2.pdf
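The paper analyses a regularized tensor-Newton method for nonlinear least-squares. As a rough, illustrative sketch only (not the authors' higher-order tensor model), the snippet below shows the classical regularized Gauss-Newton (Levenberg-Marquardt) idea that several of the cited works build on; the function names, parameters, and test problem are assumptions chosen for the example.

```python
import numpy as np

def regularized_gauss_newton(r, J, x0, sigma=1.0, tol=1e-8, max_iter=100):
    """Minimize 0.5*||r(x)||^2 with a simple regularized Gauss-Newton loop.

    Illustrative sketch of the Levenberg-Marquardt idea (quadratic model plus
    a regularization term), NOT the regularized tensor-Newton method analysed
    in the paper. `r` returns the residual vector, `J` its Jacobian, and
    `sigma` is the regularization weight.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        res, Jac = r(x), J(x)
        g = Jac.T @ res                       # gradient of 0.5*||r||^2
        if np.linalg.norm(g) <= tol:
            break
        # Solve the regularized normal equations (J^T J + sigma*I) s = -g.
        H = Jac.T @ Jac + sigma * np.eye(x.size)
        s = np.linalg.solve(H, -g)
        if 0.5 * np.sum(r(x + s) ** 2) < 0.5 * np.sum(res ** 2):
            x = x + s                         # accept step, relax regularization
            sigma = max(sigma / 2.0, 1e-12)
        else:
            sigma *= 4.0                      # reject step, regularize more heavily
    return x

# Hypothetical usage: fit the residuals r(x) = [10*(x1 - x0^2), 1 - x0].
if __name__ == "__main__":
    r = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
    J = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])
    print(regularized_gauss_newton(r, J, np.array([-1.2, 1.0])))
```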
Related Items
- A Stochastic Levenberg–Marquardt Method Using Random Models with Complexity Results
- Efficiency of higher-order algorithms for minimizing composite functions
- The evaluation complexity of finding high-order minimizers of nonconvex optimization
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- An adaptive high order method for finding third-order critical points of nonconvex optimization
Uses Software
Cites Work
- On the local convergence of a derivative-free algorithm for least-squares minimization
- A trust region algorithm with a worst-case iteration complexity of \(\mathcal{O}(\epsilon^{-3/2})\) for nonconvex optimization
- Worst-case evaluation complexity for unconstrained nonlinear optimization using high-order regularized models
- Adaptive cubic regularisation methods for unconstrained optimization. I: Motivation, convergence and numerical results
- Adaptive cubic regularisation methods for unconstrained optimization. II: Worst-case function- and derivative-evaluation complexity
- Introductory lectures on convex optimization. A basic course.
- Concise complexity analyses for trust region methods
- Evaluation complexity bounds for smooth constrained nonlinear optimization using scaled KKT conditions and high-order models
- Cubic regularization of Newton method and its global performance
- On the Evaluation Complexity of Cubic Regularization Methods for Potentially Rank-Deficient Nonlinear Least-Squares Problems and Its Relevance to Constrained Nonlinear Optimization
- On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems
- A Derivative-Free Algorithm for Least-Squares Minimization
- Hessian Matrix vs. Gauss–Newton Hessian Matrix
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- An Adaptive Nonlinear Least-Squares Algorithm
- Algorithms for the Solution of the Nonlinear Least-Squares Problem
- Algorithm 768: TENSOLVE
- Tensor Methods for Large, Sparse Nonlinear Least Squares Problems
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy
- The Use of Quadratic Regularization with a Cubic Descent Condition for Unconstrained Optimization
- GALAHAD, a library of thread-safe Fortran 90 packages for large-scale nonlinear optimization
- Regularized Newton Methods for Minimizing Functions with Hölder Continuous Hessians
- A method for the solution of certain non-linear problems in least squares