Pages that link to "Item:Q4652003"
From MaRDI portal
The following pages link to Prox-Method with Rate of Convergence <i>O</i>(1/<i>t</i>) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems (Q4652003):
Displaying 50 items.
- Primal-dual subgradient methods for convex problems (Q116219)
- PPA-like contraction methods for convex optimization: a framework using variational inequality approach (Q259109)
- Solving variational inequalities with monotone operators on domains given by linear minimization oracles (Q263192)
- A semi-definite programming approach for robust tracking (Q263222)
- Sublinear time algorithms for approximate semidefinite programming (Q304246)
- On the ergodic convergence rates of a first-order primal-dual algorithm (Q312675)
- Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization (Q364458)
- Sparse non Gaussian component analysis by semidefinite programming (Q374189)
- An improved first-order primal-dual algorithm with a new correction step (Q386446)
- An optimal method for stochastic composite optimization (Q431018)
- An implementable proximal point algorithmic framework for nuclear norm minimization (Q431025)
- On the \(O(1/t)\) convergence rate of the projection and contraction methods for variational inequalities with Lipschitz continuous monotone operators (Q461439)
- Inexact alternating-direction-based contraction methods for separable linearly constrained convex optimization (Q467471)
- An alternating extragradient method with non Euclidean projections for saddle point problems (Q480931)
- Dual subgradient algorithms for large-scale nonsmooth learning problems (Q484132)
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization (Q519779)
- An extragradient-based alternating direction method for convex minimization (Q525598)
- Korpelevich's method for variational inequality problems in Banach spaces (Q539500)
- Estimation of high-dimensional low-rank matrices (Q548539)
- Approximation accuracy, gradient methods, and error bound for structured convex optimization (Q607498)
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming (Q623454)
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization (Q633105)
- Barrier subgradient method (Q633113)
- A version of the mirror descent method to solve variational inequalities (Q681901)
- Self-concordant barriers for convex approximations of structured convex sets (Q707744)
- The generalized proximal point algorithm with step size 2 is not necessarily convergent (Q721955)
- A primal-dual prediction-correction algorithm for saddle point optimization (Q727398)
- On the convergence rate of Douglas-Rachford operator splitting method (Q747781)
- On the resolution of misspecified convex optimization and monotone variational inequality problems (Q782913)
- Large-scale semidefinite programming via a saddle point mirror-prox algorithm (Q868467)
- Dual extrapolation and its applications to solving variational inequalities and related problems (Q868471)
- On the convergence rate of a class of proximal-based decomposition methods for monotone variational inequalities (Q888313)
- Subgradient methods for saddle-point problems (Q1035898)
- Stochastic mirror descent dynamics and their convergence in monotone variational inequalities (Q1626529)
- A simplified view of first order methods for optimization (Q1650767)
- On the information-adaptive variants of the ADMM: an iteration complexity perspective (Q1668725)
- Bounded perturbation resilience of extragradient-type methods and their applications (Q1677995)
- Accelerated schemes for a class of variational inequalities (Q1680963)
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes (Q1711086)
- A cyclic block coordinate descent method with generalized gradient projections (Q1733536)
- Level-set methods for convex optimization (Q1739042)
- On the optimal linear convergence rate of a generalized proximal point algorithm (Q1742669)
- An optimal randomized incremental gradient method (Q1785198)
- A simple algorithm for a class of nonsmooth convex-concave saddle-point problems (Q1785640)
- A first-order primal-dual algorithm for convex problems with applications to imaging (Q1932848)
- Accelerated linearized Bregman method (Q1945379)
- A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems (Q1946618)
- Iteration-complexity of first-order penalty methods for convex programming (Q1949272)
- Nonsymmetric proximal point algorithm with moving proximal centers for variational inequalities: convergence analysis (Q2010228)
- Dynamic stochastic approximation for multi-stage stochastic optimization (Q2020613)