Regularized Newton methods for convex minimization problems with singular solutions
Publication: Q1876588
DOI: 10.1023/B:COAP.0000026881.96694.32
zbMath: 1056.90111
OpenAlex: W2004381194
MaRDI QID: Q1876588
Masao Fukushima, Nobuo Yamashita, Liqun Qi, Dong-hui Li
Publication date: 20 August 2004
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1023/b:coap.0000026881.96694.32
Related Items (29)
Truncated regularized Newton method for convex minimizations
A positive spectral gradient-like method for large-scale nonlinear monotone equations
On the convergence of an inexact Newton-type method
A regularized Newton method for degenerate unconstrained optimization problems
Convergence analysis of a regularized interior point algorithm for the barrier problems with singular solutions
Regularization of limited memory quasi-Newton methods for large-scale nonconvex minimization
Newton-MR: inexact Newton method with minimum residual sub-problem solver
Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
A Regularized Newton Method for \(\ell_q\)-Norm Composite Optimization Problems
Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
An inexact proximal regularization method for unconstrained optimization
Self-adaptive inexact proximal point methods
A trust region method for optimization problem with singular solutions
A regularized Newton method for monotone nonlinear equations and its application
Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization
Superlinear convergence of a Newton-type algorithm for monotone equations
Adjoint-based exact Hessian computation
Local convergence analysis of the Levenberg-Marquardt framework for nonzero-residue nonlinear least-squares problems under an error bound condition
Local convergence analysis of a primal-dual method for bound-constrained optimization without SOSC
A new regularized quasi-Newton method for unconstrained optimization
A regularized Newton method without line search for unconstrained optimization
Trust-region quadratic methods for nonlinear systems of mixed equalities and inequalities
Two nonmonotone trust region algorithms based on an improved Newton method
A two-step improved Newton method to solve convex unconstrained optimization problems
Efficient regularized Newton-type algorithm for solving convex optimization problem
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
Correction of trust region method with a new modified Newton method