Convex optimization: algorithms and complexity (Q2809807)
From MaRDI portal
scientific article; zbMATH DE number 6587539
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Convex optimization: algorithms and complexity | scientific article; zbMATH DE number 6587539 | |
Statements
- 30 May 2016
- convex optimization
- black-box model
- stochastic optimization
- numerical methods
- Convex optimization: algorithms and complexity (English)
This monograph is based on lectures given at Princeton University in 2013 and 2014 and presents the main complexity theorems in convex optimization together with their corresponding algorithms. Starting from the fundamental theory of black-box optimization, the material progresses towards recent advances in structural optimization and stochastic optimization. The presentation of black-box optimization includes the analysis of cutting-plane methods as well as gradient descent schemes. The author pays special attention to non-Euclidean settings and discusses their relevance in machine learning. The book provides a gentle introduction to structural optimization and to saddle-point mirror prox (Nemirovski's alternative to Nesterov's smoothing), along with a concise description of interior-point methods. In the part on stochastic optimization, the author discusses stochastic gradient descent, mini-batches, random coordinate descent, and sublinear algorithms. At the end, the author briefly touches upon convex relaxations of combinatorial problems, the use of randomness to round solutions, and methods based on random walks.
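As a minimal illustration of the gradient descent schemes analyzed in the black-box part of the monograph, the following sketch runs projected gradient descent on a smooth convex objective over the Euclidean unit ball. The least-squares objective, step size 1/L, and iteration count are illustrative assumptions, not taken from the book.

```python
# Sketch: projected gradient descent for f(x) = 0.5 * ||A x - b||^2
# constrained to the Euclidean unit ball (illustrative problem choice).
import numpy as np

def project_unit_ball(x):
    """Euclidean projection onto {x : ||x||_2 <= 1}."""
    norm = np.linalg.norm(x)
    return x if norm <= 1.0 else x / norm

def projected_gradient_descent(A, b, steps=200):
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2            # smoothness constant of f
    for _ in range(steps):
        grad = A.T @ (A @ x - b)             # gradient of 0.5 * ||A x - b||^2
        x = project_unit_ball(x - grad / L)  # gradient step, then projection
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)
    x_hat = projected_gradient_descent(A, b)
    print("objective:", 0.5 * np.linalg.norm(A @ x_hat - b) ** 2)
```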