Totally convex functions for fixed points computation and infinite dimensional optimization (Q1575115)
From MaRDI portal
scientific article; zbMATH DE number 1493048
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Totally convex functions for fixed points computation and infinite dimensional optimization | scientific article; zbMATH DE number 1493048 | |
Statements
Totally convex functions for fixed points computation and infinite dimensional optimization (English)
20 August 2000
For a differentiable convex function \(f\), the Bregman distance is \[ D_f(x,y) := f(y) - f(x) + f^\circ (x,x-y), \] where \(f^\circ(x,d)\) denotes the directional derivative of \(f\) at \(x\) in the direction \(d\). The function \(f\) is called totally convex if \(\inf\{ D_f(x,y) : \|x-y\|=t\} > 0\) for every \(x\) and every \(t > 0\). These concepts are applied to the computation of common fixed points of families of operators in Banach spaces that are not Hilbert spaces, using a projection operator constructed from \(D_f\). The discussion includes stochastic convex feasibility problems, a proximal point method, and an augmented Lagrangian method for constrained optimization; convergence proofs are given.
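To make the definition concrete, here is a standard special case (an illustration added here, not taken from the book under review): for \(f = \frac{1}{2}\|\cdot\|^2\) on a Hilbert space one has \(f^\circ(x,d) = \langle x, d\rangle\), so the Bregman distance reduces to half the squared metric, \(f\) is totally convex, and the \(D_f\)-projection onto a closed convex set is the ordinary metric projection:
\[
D_f(x,y) = \tfrac{1}{2}\|y\|^2 - \tfrac{1}{2}\|x\|^2 + \langle x, x-y\rangle = \tfrac{1}{2}\|x-y\|^2,
\qquad
\inf\{D_f(x,y) : \|x-y\| = t\} = \tfrac{t^2}{2} > 0 \quad (t > 0).
\]
In the same notation, the proximal point method mentioned in the review replaces the squared distance of the classical proximal step by \(D_f\); a generic Bregman proximal iteration for minimizing a convex function \(\varphi\) (schematic form only, with parameters \(\lambda_k > 0\); details as in the book) reads
\[
x^{k+1} \in \operatorname*{argmin}_{y} \bigl\{ \varphi(y) + \lambda_k D_f(x^k, y) \bigr\}.
\]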
totally convex
Bregman distance
fixed points
Banach space
stochastic convex feasibility
proximal point
augmented Lagrangian