On the use of directions of negative curvature in a modified Newton method
From MaRDI portal
Publication: 4177356
DOI: 10.1007/BF01582091 · zbMath: 0394.90093 · OpenAlex: W1580352598 · MaRDI QID: Q4177356
Jorge J. Moré, Danny C. Sorensen
Publication date: 1979
Published in: Mathematical Programming
Full work available at URL: https://doi.org/10.1007/bf01582091
Keywords: Algorithm, Convergence, Decomposition, Nonlinear Programming, Directions of Negative Curvature, Modified Newton Method
Related Items
- On Variable-Metric Methods for Sparse Hessians
- On practical conditions for the existence and uniqueness of solutions to the general equality quadratic programming problem
- Second-order negative-curvature methods for box-constrained and general constrained optimization
- An output error model and algorithm for electromagnetic system identification
- On the use of a modified Newton method for nonlinear finite element analysis
- Zwei Trajektorienverfahren zur Lösung nichtlinearer Optimierungsaufgaben [Two trajectory methods for solving nonlinear optimization problems]
- A note on the use of vector barrier parameters for interior-point methods
- Using improved directions of negative curvature for the solution of bound-constrained nonconvex problems
- Curved search methods for unconstrained optimization
- Modifications of the Wolfe line search rules to satisfy second-order optimality conditions in unconstrained optimization
- Improvements of the Newton-Raphson method
- Training multi-layered neural network with a trust-region based algorithm
- Detecting negative eigenvalues of exact and approximate Hessian matrices in optimization
- Adaptive nonmonotone line search method for unconstrained optimization
- Nonmonotone second-order Wolfe's line search method for unconstrained optimization problems
- Finding second-order stationary points in constrained minimization: a feasible direction approach
- Iterative grossone-based computation of negative curvature directions in large-scale optimization
- Exploiting negative curvature in deterministic and stochastic optimization
- First-order methods almost always avoid strict saddle points
- Using negative curvature in solving nonlinear programs
- A derivative-free modified tensor method with curvilinear linesearch for unconstrained nonlinear programming
- Minimizing a differentiable function over a differential manifold
- Minimization methods for functions on simple sets
- Avoiding Modified Matrix Factorizations in Newton-like Methods
- Polarity and conjugacy for quadratic hypersurfaces: a unified framework with recent advances
- Iterative computation of negative curvature directions in large scale optimization
- An unconstrained optimization method using nonmonotone second order Goldstein's line search
- Modified Cholesky algorithms: A catalog with new approaches
- Nonconvex optimization using negative curvature within a modified linesearch
- Combining and scaling descent and negative curvature directions
- A curvilinear search algorithm for unconstrained optimization by automatic differentiation
- A curvilinear method based on minimal-memory BFGS updates
- Curvilinear path steplength algorithms for minimization which use directions of negative curvature
- On the final steps of Newton and higher order methods
- A decoupled first/second-order steps technique for nonconvex nonlinear unconstrained optimization with improved complexity bounds
- A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization
- A second-order globally convergent direct-search method and its worst-case complexity
- Nonmonotone curvilinear line search methods for unconstrained optimization
- Une méthode de gradient conjugué sur des variétés, application à certains problèmes de valeurs propres non linéaires [A conjugate gradient method on manifolds, with application to certain nonlinear eigenvalue problems]
- Modified Newton methods for solving fully monolithic phase-field quasi-static brittle fracture propagation
- Improving directions of negative curvature in an efficient manner
- A restricted trust region algorithm for unconstrained optimization
- Gradient Descent Only Converges to Minimizers: Non-Isolated Critical Points and Invariant Regions
- A dwindling filter line search method for unconstrained optimization
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. I: Theory
- Planar conjugate gradient algorithm for large-scale unconstrained optimization. II: Application
- The application of optimal control methodology to nonlinear programming problems
- An interior method for nonconvex semidefinite programs
- Extending the Step-Size Restriction for Gradient Descent to Avoid Strict Saddle Points
Cites Work
- A modified Newton method for minimization
- A second-order method for unconstrained optimization
- Minimization of functions having Lipschitz continuous first partial derivatives
- Newton-type methods for unconstrained and linearly constrained optimization
- Some Stable Methods for Calculating Inertia and Solving Symmetric Linear Systems
- A modification of Armijo's step-size rule for negative curvature
- Convergence Conditions for Ascent Methods. II: Some Corrections
- On the reduction of a symmetric matrix to tridiagonal form