On convergence of the stochastic subgradient method with on-line stepsize rules (Q1083369)
scientific article; zbMATH DE number 3976786
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | On convergence of the stochastic subgradient method with on-line stepsize rules | scientific article; zbMATH DE number 3976786 | |
Statements
On convergence of the stochastic subgradient method with on-line stepsize rules (English)
1986
The paper considers the stochastic subgradient method for solving convex programming problems. More precisely, the authors deal with the problem of determining the stepsize coefficients in the classical stochastic subgradient method under the assumption that neither the exact values nor the exact subgradients of the optimized function are available, only stochastic estimates of the subgradients. First, the authors build on earlier papers on this topic. Then they analyse the properties of the method with special on-line stepsize coefficients, and a convergence theorem is stated and proved.
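For concreteness, here is a minimal Python sketch of the general setting the review describes: a stochastic subgradient iteration whose stepsize is adjusted on-line from observed quantities instead of being fixed in advance. The oracle `noisy_subgrad`, the shrinking factor 0.9, and the sign-of-inner-product test are illustrative assumptions; they are not the specific rule or the convergence conditions analysed by the authors.

```python
import numpy as np

def stochastic_subgradient(x0, noisy_subgrad, n_iter=1000, a0=1.0, seed=0):
    """Stochastic subgradient iteration with a simple on-line stepsize rule.

    `noisy_subgrad(x, rng)` must return an unbiased estimate of a subgradient
    of the convex objective at x; exact function values and exact subgradients
    are never used.  (Illustrative sketch, not the rule from the paper.)
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    x_avg = x.copy()          # running average of the iterates
    a = a0                    # current stepsize
    g_prev = None
    for k in range(1, n_iter + 1):
        g = noisy_subgrad(x, rng)              # stochastic subgradient estimate
        # Hypothetical on-line rule: shrink the stepsize only when the new
        # estimate points against the previous one, i.e. the iterates seem
        # to oscillate around a minimizer; otherwise keep the stepsize.
        if g_prev is not None and np.dot(g, g_prev) < 0:
            a *= 0.9
        x = x - a * g
        g_prev = g
        x_avg += (x - x_avg) / k               # average of x_1, ..., x_k
    return x, x_avg

# Usage: minimize E|a^T x - b| with (a, b) random, observing one sample's
# subgradient per iteration (x_star = all-ones is the assumed minimizer).
def make_oracle(dim=5):
    x_star = np.ones(dim)
    def noisy_subgrad(x, rng):
        a = rng.standard_normal(dim)
        b = a @ x_star
        return np.sign(a @ x - b) * a          # subgradient of |a^T x - b|
    return noisy_subgrad

x_last, x_avg = stochastic_subgradient(np.zeros(5), make_oracle())
```

Shrinking the stepsize only when successive stochastic subgradients point in opposing directions is a classical on-line heuristic (it detects oscillation around a minimizer); it stands in here purely as an example of a rule driven by observed information rather than a predetermined schedule.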
Keywords: stochastic subgradient method; special stepsize coefficients; convergence theorem