A Newton-type curvilinear search method for optimization
Publication: 1227936
DOI: 10.1016/0022-247X(76)90246-8 · zbMath: 0331.49026 · OpenAlex: W2015796677 · MaRDI QID: Q1227936
C. A. Botsaris, David H. Jacobson
Publication date: 1976
Published in: Journal of Mathematical Analysis and Applications
Full work available at URL: https://doi.org/10.1016/0022-247x(76)90246-8
Related Items (19)
- Calibration by optimization without using derivatives
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- Linear quadratic dynamic programming for water reservoir management
- An efficient curvilinear method for the minimization of a nonlinear function subject to linear inequality constraints
- A class of differential descent methods for constrained optimization
- Optimal homotopy asymptotic method-least square for solving nonlinear fractional-order gradient-based dynamic system from an optimization problem
- Curvilinear path steplength algorithms for minimization which use directions of negative curvature
- On orthogonal trajectories and optimization
- A curvilinear optimization method based upon iterative estimation of the eigensystem of the Hessian matrix
- A new arc algorithm for unconstrained optimization
- Differential gradient methods
- A class of methods for unconstrained minimization based on stable numerical integration techniques
- A Newton-type curvilinear search method for constrained optimization
- Nonmonotonic reduced projected Hessian method via an affine scaling interior modified gradient path for bounded-constrained optimization
- A new super-memory gradient method with curve search rule
- Adaptive Douglas--Rachford Splitting Algorithm from a Yosida Approximation Standpoint
- Note on global convergence of ODE methods for unconstrained optimization
- On the convergence of curvilinear search algorithms in unconstrained optimization
- K-K-T multiplier estimates and objective function lower bounds from projective SUMT
Cites Work
- An algorithm that minimizes homogeneous functions of \(n\) variables in \(n + 2\) iterations and rapidly minimizes general functions
- A Rapidly Convergent Descent Method for Minimization
- Quasi-Newton Methods and their Application to Function Minimisation
- On the Relative Efficiencies of Gradient Methods