Regularized Newton methods for convex minimization problems with singular solutions

From MaRDI portal
Publication:1876588

DOI: 10.1023/B:COAP.0000026881.96694.32
zbMath: 1056.90111
OpenAlex: W2004381194
MaRDI QID: Q1876588

Masao Fukushima, Nobuo Yamashita, Liqun Qi, Dong-hui Li

Publication date: 20 August 2004

Published in: Computational Optimization and Applications

Full work available at URL: https://doi.org/10.1023/b:coap.0000026881.96694.32
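The title names the core difficulty: at a singular solution the Hessian is not invertible, so the plain Newton system breaks down. A minimal illustrative sketch of the general idea, in Python, is given below. This is an assumption-laden simplification, not the authors' exact algorithm: the function names, the constant `c`, and the Levenberg-Marquardt-style shift `mu_k = c * ||grad f(x_k)||` are all illustrative choices.

```python
import numpy as np

def regularized_newton(grad, hess, x0, c=1.0, tol=1e-10, max_iter=100):
    # Illustrative regularized Newton iteration: shift the Hessian by
    # mu_k * I with mu_k = c * ||grad f(x_k)||, so the linear system stays
    # solvable even when the Hessian is singular at the solution, while the
    # regularization vanishes as the gradient goes to zero.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        H = hess(x) + c * gnorm * np.eye(x.size)
        x = x - np.linalg.solve(H, g)
    return x

# Toy example: f(x) = x^4 / 4 has a singular minimizer x* = 0
# (f''(0) = 0), where the unregularized Newton system degenerates.
sol = regularized_newton(lambda x: x**3,
                         lambda x: np.diag(3.0 * x**2),
                         x0=[1.0])
```

On this toy problem the iterates approach the singular minimizer 0, which the unshifted Newton step cannot handle once the Hessian degenerates.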




Related Items (29)

Truncated regularized Newton method for convex minimizations
A positive spectral gradient-like method for large-scale nonlinear monotone equations
On the convergence of an inexact Newton-type method
A regularized Newton method for degenerate unconstrained optimization problems
Convergence analysis of a regularized interior point algorithm for the barrier problems with singular solutions
Regularization of limited memory quasi-Newton methods for large-scale nonconvex minimization
Newton-MR: inexact Newton method with minimum residual sub-problem solver
Regularized Newton Method with Global \(\mathcal{O}(1/k^2)\) Convergence
A Regularized Newton Method for \(\ell_q\)-Norm Composite Optimization Problems
Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization
On the Quadratic Convergence of the Cubic Regularization Method under a Local Error Bound Condition
A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
An inexact proximal regularization method for unconstrained optimization
Self-adaptive inexact proximal point methods
A trust region method for optimization problem with singular solutions
A regularized Newton method for monotone nonlinear equations and its application
Convergence properties of the regularized Newton method for the unconstrained nonconvex optimization
Superlinear convergence of a Newton-type algorithm for monotone equations
Adjoint-based exact Hessian computation
Local convergence analysis of the Levenberg-Marquardt framework for nonzero-residue nonlinear least-squares problems under an error bound condition
Local convergence analysis of a primal-dual method for bound-constrained optimization without SOSC
A new regularized quasi-Newton method for unconstrained optimization
A regularized Newton method without line search for unconstrained optimization
Trust-region quadratic methods for nonlinear systems of mixed equalities and inequalities
Two nonmonotone trust region algorithms based on an improved Newton method
A two-step improved Newton method to solve convex unconstrained optimization problems
Efficient regularized Newton-type algorithm for solving convex optimization problem
A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
Correction of trust region method with a new modified Newton method




This page was built for publication: Regularized Newton methods for convex minimization problems with singular solutions