Approximation methods for the unconstrained optimization
DOI: 10.1007/BF01083884 · zbMath: 0348.90115 · OpenAlex: W2014112378 · MaRDI QID: Q1234630
V. K. Saul'ev, I. I. Samoilova
Publication date: 1976
Published in: Journal of Soviet Mathematics
Full work available at URL: https://doi.org/10.1007/bf01083884
Numerical mathematical programming methods (65K05); Nonlinear programming (90C30); Numerical methods in optimal control (49Mxx)
Cites Work
- Gradient methods of maximization
- Variational methods in problems of control and programming
- Cauchy's method of minimization
- Newton's method for convex programming and Tschebyscheff approximation
- Application of the Monte Carlo method to systems of nonlinear algebraic equations
- Contributions to the theory of the method of steepest descent
- On the asymptotic directions of the s-dimensional optimum gradient method
- An effective algorithm for minimization
- Memory gradient method for the minimization of functions
- Relation between the memory gradient method and the Fletcher-Reeves method
- Study on a supermemory gradient method for the minimization of functions
- Convergence of the conjugate gradient method with computationally convenient modifications
- Unified approach to quadratically convergent algorithms for function minimization
- Numerical experiments on quadratically convergent algorithms for function minimization
- The potential method for conditional maxima in the locally compact metric spaces
- Numerical computational methods of optimisation in control
- An algorithm that minimizes homogeneous functions of \(n\) variables in \(n + 2\) iterations and rapidly minimizes general functions
- Minimization of functions having Lipschitz continuous first partial derivatives
- On variable-metric algorithms
- Properties of the conjugate-gradient and Davidon methods
- A method of unconstrained global optimization
- Méthodes numériques pour la décomposition et la minimisation de fonctions non différentiables. (Numerical methods for the decomposition and minimization of nondifferentiable functions)
- On the use of generalized inverses in function minimization
- Über Dämpfung bei Minimalisierungsverfahren. (On damping in minimization methods)
- Research on a constrained minimization problem of a function with first and second derivatives
- Comparison of some conjugate direction procedures for function minimization
- Extremwertermittlung mit Funktionswerten bei Funktionen von mehreren Veränderlichen. (Determination of the extremum value by means of function values for functions of several variables)
- Über die Konvergenz von Einzelschrittverfahren zur Minimierung konvexer Funktionen. (On convergence of one-step methods for minimizing convex functions.)
- Bemerkungen zum Gradientenverfahren. (Remarks on the gradient method)
- Stochastic ascent
- Über einige Methoden der Relaxationsrechnung. (On some methods of relaxation computation)
- Inversion of Matrices by Biorthogonalization and Related Results
- Ein allgemeines Maximalisierungverfahren. (A general maximization method)
- An Iterative Method for Finding Stationary Values of a Function of Several Variables
- "Direct Search" Solution of Numerical and Statistical Problems
- Optimization Problems: Solution by an Analogue Computer
- The Created Response Surface Technique for Optimizing Nonlinear, Restrained Systems
- An Algorithm for Least-Squares Estimation of Nonlinear Parameters
- A Review of Minimization Techniques for Nonlinear Functions
- Variable Metric Method for Minimization
- A Rank Two Algorithm for Unconstrained Minimization
- Rate of Convergence of Several Conjugate Gradient Algorithms
- A Quasi-Newton Method with No Derivatives
- Some Algorithms for Minimizing a Function of Several Variables
- Minimizing Certain Convex Functions
- A Rapidly Convergent Descent Method for Minimization
- Function minimization by conjugate gradients
- An efficient method for finding the minimum of a function of several variables without calculating derivatives
- Function Minimization Without Evaluating Derivatives--a Review
- Location of the Maximum on Unimodal Surfaces
- Extensions of SUMT for Nonlinear Programming: Equality Constraints and Extrapolation
- A Comparison of Several Current Optimization Methods, and the use of Transformations in Constrained Problems
- An iterative method for locating turning points
- The Conjugate Gradient Method for Linear and Nonlinear Operator Equations
- Quasi-Newton Methods and their Application to Function Minimisation
- Variance algorithm for minimization
- On finding local maxima of functions of a real variable
- On the Relative Efficiencies of Gradient Methods
- An Improved Procedure for Orthogonalising the Search Vectors in Rosenbrock's and Swann's Direct Search Optimisation Methods
- Optimization by Least Squares
- Extended Aitken acceleration
- Two Algorithms Related to the Method of Steepest Descent
- Convergence Conditions for Ascent Methods
- The Slacked Unconstrained Minimization Technique for Convex Programming
- Minimizing a function without calculating derivatives
- A Survey of Numerical Methods for Unconstrained Optimization
- The application of third variations to function minimization
- Global Convergence of Newton–Gauss–Seidel Methods
- Modification of a Quasi-Newton Method for Nonlinear Equations with a Sparse Jacobian
- Computational experience with quadratically convergent minimisation methods
- Comparison of Gradient Methods for the Solution of Nonlinear Parameter Estimation Problems
- A Correction Concerning the Convergence Rate for the Conjugate Gradient Method
- The Convergence of a Class of Double-rank Minimization Algorithms
- A new approach to variable metric algorithms
- The Convergence of Single-Rank Quasi-Newton Methods
- A Family of Gradient Methods for Optimization
- Minimization by Successive Approximation
- On Descent from Local Minima
- A comparison of gradient dependent techniques for the minimization of an unconstrained function of several variables
- A Nongradient and Parallel Algorithm for Unconstrained Minimization
- The Convergence of a Class of Double-rank Minimization Algorithms 1. General Considerations
- A comparison of modified Newton methods for unconstrained optimisation
- Minimization Algorithms Making Use of Non-quadratic Properties of the Objective Function
- The Convergence of an Algorithm for Solving Sparse Nonlinear Systems
- Asymptotic Distribution of Stochastic Approximation Procedures
- A Simplex Method for Function Minimization
- A fifth-order family of modified Newton methods
- The Hydrodynamic Resistance of a Fluid Sphere Submerged in Stokes Flows
- Letter to the Editor—A Monte Carlo Method for the Approximate Solution of Certain Types of Constrained Optimization Problems
- N-step conjugate gradient minimization scheme for nonquadratic functions
- A Modification of Davidon's Minimization Method to Accept Difference Approximations of Derivatives
- Über die Schrittweitenwahl bei Abstiegsverfahren zur Minimierung konvexer Funktionen. (On step-size selection in descent methods for minimizing convex functions)
- Convergence rate of the gradient descent method with dilatation of the space
- Sur la méthode de Davidon-Fletcher-Powell pour la minimisation des fonctions. (On the Davidon-Fletcher-Powell method for function minimization)
- Parameter Selection for Modified Newton Methods for Function Minimization
- Quasi-Newton Methods for Unconstrained Optimization
- Sequential Application of Simplex Designs in Optimisation and Evolutionary Operation
- Methods of conjugate gradients for solving linear systems
- Stochastic Estimation of the Maximum of a Regression Function
- Sequential Minimax Search for a Maximum
- Approximation Methods which Converge with Probability one
- Point Estimates of Ordinates of Concave Functions
- A study of test functions for optimization algorithms