On the global optimization properties of finite-difference local descent algorithms (Q1207044)
scientific article; zbMATH DE number 151887
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | On the global optimization properties of finite-difference local descent algorithms | scientific article; zbMATH DE number 151887 | |
Statements
On the global optimization properties of finite-difference local descent algorithms (English)
4 May 1993
The global optimization problem (1) \(F(x)\to \inf_{x\in E_k}\), with \(E_k\) a \(k\)-dimensional Euclidean space, is investigated under the assumption that (1) is \(\gamma\)-convex structured, that is, there exists a strongly convex (with parameter \(\ell>0\)) and differentiable function \(\Phi(\cdot)\), with \(\nabla\Phi(\cdot)\) Lipschitzian with parameter \(L\), such that \(|\Phi(x)-F(x)|\leq \gamma\) for all \(x\in E_k\). The function \(\Phi(\cdot)\) is called strongly convex in \(E_k\) with parameter \(\ell>0\) if \[ \Phi(\lambda x_1+(1-\lambda)x_2)\leq \lambda\Phi(x_1)+(1-\lambda)\Phi(x_2)-\ell\lambda(1-\lambda)\|x_1-x_2\|^2/2 \] for any \(x_1,x_2\in E_k\) and any \(\lambda\) with \(0\leq\lambda\leq 1\). The author first considers a perturbed gradient descent method for approximating the global minimum of \(\Phi(x)\), and then, based on it, introduces a finite-difference algorithm aimed at solving (1) approximately. Following the same approach, he investigates the convergence properties of a coordinate descent method.
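To make the setting concrete, a minimal numerical sketch follows. It is not the paper's algorithm verbatim: it runs gradient descent driven by central finite differences on an illustrative objective \(F=\Phi+\) a bounded oscillation with \(|F-\Phi|\leq\gamma\); the test function, the difference step \(h\), the step size and the iteration count are assumptions chosen only for illustration.

```python
import numpy as np

# Illustrative gamma-convex structured objective (an assumption, not from the paper):
# Phi is strongly convex and smooth, F stays within gamma of Phi everywhere.
gamma = 0.05
Phi = lambda x: 0.5 * np.dot(x, x)                                  # strongly convex quadratic
F = lambda x: Phi(x) + gamma * np.sin(20.0 * np.linalg.norm(x))     # |F(x) - Phi(x)| <= gamma

def fd_gradient(f, x, h):
    """Central finite-difference estimate of the gradient of f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def fd_descent(f, x0, step=0.1, h=0.5, iters=200):
    """Gradient descent driven by finite differences of f.

    A difference step h that is large relative to the oscillation scale
    averages out the gamma-sized perturbation, so the iterates follow the
    descent direction of the convex majorant Phi rather than the local
    wiggles of F."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x - step * fd_gradient(f, x, h)
    return x

x_star = fd_descent(F, x0=[3.0, -2.0])
print(x_star, F(x_star))   # lands in a neighbourhood of the global minimizer of Phi
```

The design point of the sketch is the choice of the difference step: taking \(h\) comparatively large smooths the \(\gamma\)-sized perturbation, so the finite-difference direction approximates \(\nabla\Phi\) and the iterates reach an approximate global minimum of \(F\).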
global optimization
perturbed gradient descent method