Stochastic block-coordinate gradient projection algorithms for submodular maximization (Q1723100)
From MaRDI portal
scientific article; zbMATH DE number 7025145
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Stochastic block-coordinate gradient projection algorithms for submodular maximization | scientific article; zbMATH DE number 7025145 | |
Statements
Stochastic block-coordinate gradient projection algorithms for submodular maximization (English)
19 February 2019
Summary: We consider a huge-scale stochastic continuous submodular optimization problem, which arises naturally in many applications such as machine learning. With high-dimensional data, computing the whole gradient vector can become prohibitively expensive. To reduce the complexity and memory requirements, we propose a stochastic block-coordinate gradient projection algorithm for maximizing continuous submodular functions, which chooses a random block of coordinates of the gradient vector and updates the corresponding estimates along the positive gradient direction. We prove that the estimates of all nodes generated by the algorithm converge to stationary points with probability 1. Moreover, we show that, by choosing appropriate step sizes, the proposed algorithm achieves the tight \(((p_{\min}/2)F^\ast -\epsilon)\) approximation guarantee after \(O(1/\epsilon^2)\) iterations for DR-submodular functions. Furthermore, by choosing diminishing step sizes, the algorithm achieves the tight \(((\gamma^2/(1+\gamma^2)) p_{\min} F^\ast -\epsilon)\) approximation guarantee after \(O(1/\epsilon^2)\) iterations for weakly DR-submodular functions with parameter \(\gamma\).
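The update loop described in the summary — sample a random coordinate block, take a step along the positive (stochastic) gradient restricted to that block, then project back onto the feasible set — can be sketched as follows. This is an illustrative sketch, not the paper's algorithm: the names `grad_oracle`, `block_size`, and the box constraint `[lower, upper]` are assumptions standing in for the paper's general stochastic oracle, block-sampling distribution (with minimum selection probability \(p_{\min}\)), and constraint set.

```python
import numpy as np

def block_coordinate_gradient_projection(grad_oracle, x0, lower, upper,
                                         block_size, step_sizes, rng=None):
    """Hypothetical sketch of a stochastic block-coordinate gradient
    projection method for continuous submodular maximization.

    grad_oracle(x, idx) returns a stochastic estimate of the partial
    gradient restricted to the coordinates in idx; projection here is
    onto the box [lower, upper], a simple stand-in for the general
    feasible set in the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    n = x.size
    for eta in step_sizes:
        # choose a random block of coordinates (uniform, for illustration)
        idx = rng.choice(n, size=block_size, replace=False)
        g = grad_oracle(x, idx)          # stochastic block gradient estimate
        x[idx] += eta * g                # step along the positive gradient
        np.clip(x, lower, upper, out=x)  # project back onto the box
    return x
```

As a usage example, maximizing the concave (hence DR-submodular on the box) function \(f(x)=\sum_i x_i - \tfrac12\|x\|^2\) over \([0,1]^n\), whose gradient is \(1-x\), drives the iterates toward the maximizer \(x = \mathbf{1}\).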