Differential gradient methods
From MaRDI portal
Publication: 1257325
DOI: 10.1016/0022-247X(78)90114-2
zbMath: 0405.65040
MaRDI QID: Q1257325
Publication date: 1978
Published in: Journal of Mathematical Analysis and Applications
Keywords: Algorithms; Function Minimization; Test Functions; Differential Descent Methods; General Curvilinear Search Path; Initial-Value Systems of Differential Equations; Quadratic Function
MSC classification: Numerical optimization and variational techniques (65K10); Numerical methods for initial value problems involving ordinary differential equations (65L05); Mathematical programming (90C99)
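The keywords point to differential descent methods, in which a local minimizer of a function f is approached by following the solution trajectory of the initial-value system dx/dt = -∇f(x). The sketch below illustrates only this basic gradient-flow idea, not the paper's own algorithm (which uses a more general curvilinear search path); the explicit Euler integration, step size, and stopping tolerance are assumptions chosen for the example.

```python
import numpy as np

def gradient_flow_minimize(grad, x0, h=0.01, tol=1e-8, max_steps=10000):
    """Follow the gradient-flow ODE  dx/dt = -grad f(x)  with explicit
    Euler steps; the trajectory descends toward a stationary point of f."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_steps):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # gradient (nearly) zero: stop
            break
        x = x - h * g                 # one Euler step along the flow
    return x

# Example: minimize the quadratic f(x) = 0.5 x^T A x - b^T x,
# whose exact minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = gradient_flow_minimize(lambda x: A @ x - b, x0=[5.0, -5.0])
print(x_min, np.linalg.solve(A, b))   # the two should agree closely
```

With explicit Euler the iteration reduces to steepest descent with a fixed step length; the ODE viewpoint becomes more interesting under stable or higher-order integration schemes, a theme taken up by several of the related items below.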
Related Items (17)
- Some effective methods for unconstrained optimization based on the solution of systems of ordinary differential equations
- An efficient curvilinear method for the minimization of a nonlinear function subject to linear inequality constraints
- A comparison of methods for traversing regions of non-convexity in optimization problems
- A class of differential descent methods for constrained optimization
- Constrained optimization along geodesics
- Optimal homotopy asymptotic method-least square for solving nonlinear fractional-order gradient-based dynamic system from an optimization problem
- A new descent algorithm with curve search rule
- Local convergence of the steepest descent method in Hilbert spaces
- A trajectory-based method for constrained nonlinear optimization problems
- A curvilinear optimization method based upon iterative estimation of the eigensystem of the Hessian matrix
- A class of methods for unconstrained minimization based on stable numerical integration techniques
- A Newton-type curvilinear search method for constrained optimization
- Constrained optimization: Projected gradient flows
- A new super-memory gradient method with curve search rule
- A general method for solving constrained optimization problems
- Note on global convergence of ODE methods for unconstrained optimization
- K-K-T multiplier estimates and objective function lower bounds from projective SUMT
Cites Work
- A class of differential descent methods for constrained optimization
- A Newton-type curvilinear search method for optimization
- An algorithm that minimizes homogeneous functions of \(n\) variables in \(n + 2\) iterations and rapidly minimizes general functions
- A numerically stable optimization method based on a homogeneous function
- A Rapidly Convergent Descent Method for Minimization
- Quasi-Newton Methods and their Application to Function Minimisation
- A new approach to variable metric algorithms