Unified approach to quadratically convergent algorithms for function minimization
From MaRDI portal
Publication: 2535817
DOI: 10.1007/BF00927440
zbMath: 0184.20202
OpenAlex: W1966539012
MaRDI QID: Q2535817
Publication date: 1970
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/bf00927440
Related Items
- Unnamed Item
- A note on memory-less SR1 and memory-less BFGS methods for large-scale unconstrained optimization
- On Sparse and Symmetric Matrix Updating Subject to a Linear Equation
- Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization
- Local and superlinear convergence of quasi-Newton methods based on modified secant conditions
- A class of one parameter conjugate gradient methods
- Least-squares solution of \(F=PG\) over positive semidefinite symmetric \(P\)
- One class of dual matrix methods
- A double parameter self-scaling memoryless BFGS method for unconstrained optimization
- A Characterization of Superlinear Convergence and Its Application to Quasi-Newton Methods
- Improved Hessian approximation with modified secant equations for symmetric rank-one method
- Quasi Newton techniques generate identical points II: The proofs of four new theorems
- Suboptimal explicit receding horizon control via approximate multiparametric quadratic programming
- Generalized conjugate directions for unconstrained function minimization
- Conditions for variable-metric algorithms to be conjugate-gradient algorithms
- Cubic regularization in symmetric rank-1 quasi-Newton methods
- Eigenvalues and switching algorithms for Quasi-Newton updates
- An approach for analyzing the global rate of convergence of quasi-Newton and truncated-Newton methods
- On exact linesearch quasi-Newton methods for minimizing a quadratic function
- A rational gradient model for minimization
- On the Local and Superlinear Convergence of a Parameterized DFP Method
- A class of quadratically convergent algorithms for constrained function minimization
- On the Huang class of variable metric methods
- Two new conjugate gradient methods based on modified secant equations
- Variable metric methods in Hilbert space with applications to control problems
- Some notes on the quasi-Newton methods
- Numerical experiments on DFP-method, a powerful function minimization technique
- A variable metric method for function minimization derived from invariancy to nonlinear scaling
- Direct-prediction quasi-Newton methods in Hilbert space with applications to control problems
- Some investigations about a unified approach to quadratically convergent algorithms for function minimization
- Direct prediction methods in Hilbert space with applications to control problems
- Approximation methods for the unconstrained optimization
- Acoustic full-waveform inversion and its uncertainty estimation based on a vector-version square-root variable metric method
- Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations
- Unified approach to unconstrained minimization via basic matrix factorizations
- Planar methods and grossone for the conjugate gradient breakdown in nonlinear programming
- Exact linesearch limited-memory quasi-Newton methods for minimizing a quadratic function
- Numerical experiments on quadratically convergent algorithms for function minimization
- On variable-metric algorithms
- Self-Scaling Variable Metric Algorithms without Line Search for Unconstrained Minimization
- Optimization of large-scale complex systems
- Variable metric algorithms: Necessary and sufficient conditions for identical behaviour of nonquadratic functions
- Unconstrained approach to the extremization of constrained functions
- Quadratically convergent algorithms and one-dimensional search schemes
- Stability of Huang's update for the conjugate gradient method
- On the uniqueness of search directions in variable metric algorithms
- New approach to comparison of search methods used in nonlinear programming problems
- Method of dual matrices for function minimization
- Numerical experiments on dual matrix algorithms for function minimization
- Nonlinear hybrid procedures and fixed point iterations
- On the connection between the conjugate gradient method and quasi-Newton methods on quadratic problems
- Variable metric methods for unconstrained optimization and nonlinear least squares
- A parallel unconstrained quasi-Newton algorithm and its performance on a local memory parallel computer
- A Sparse Quasi-Newton Update Derived Variationally with a Nondiagonally Weighted Frobenius Norm
Cites Work
- Unnamed Item
- Unnamed Item
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Variable Metric Method for Minimization
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- Methods of conjugate gradients for solving linear systems