Pages that link to "Item:Q494332"
From MaRDI portal
The following pages link to Universal gradient methods for convex optimization problems (Q494332):
Displaying 50 items.
- OSGA: a fast subgradient algorithm with optimal complexity (Q304218) (← links)
- On the global convergence rate of the gradient descent method for functions with Hölder continuous gradients (Q315517) (← links)
- New results on subgradient methods for strongly convex optimization problems with a unified analysis (Q316174) (← links)
- Accelerated schemes for a class of variational inequalities (Q1680963) (← links)
- Dual approaches to the minimization of strongly convex functionals with a simple structure under affine constraints (Q1683173) (← links)
- Optimal subgradient algorithms for large-scale convex optimization in simple domains (Q1689457) (← links)
- On the computational efficiency of subgradient methods: a case study with Lagrangian bounds (Q1697974) (← links)
- Accelerated first-order methods for hyperbolic programming (Q1717219) (← links)
- Conditional gradient type methods for composite nonlinear and stochastic optimization (Q1717236) (← links)
- Universal method for stochastic composite optimization problems (Q1746349) (← links)
- Solving structured nonsmooth convex optimization with complexity \(\mathcal{O}(\varepsilon^{-1/2})\) (Q1752352) (← links)
- Complexity bounds for primal-dual methods minimizing the model of objective function (Q1785201) (← links)
- On the quality of first-order approximation of functions with Hölder continuous gradient (Q1985266) (← links)
- An accelerated directional derivative method for smooth stochastic convex optimization (Q2029381) (← links)
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach (Q2031939) (← links)
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems (Q2042418) (← links)
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization (Q2044481) (← links)
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization (Q2046565) (← links)
- Quasi-convex feasibility problems: subgradient methods and convergence rates (Q2076909) (← links)
- Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\) (Q2082553) (← links)
- Convex optimization with inexact gradients in Hilbert space and applications to elliptic inverse problems (Q2117629) (← links)
- First-order optimization algorithms via inertial systems with Hessian driven damping (Q2133411) (← links)
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle (Q2159456) (← links)
- Zeroth-order methods for noisy Hölder-gradient functions (Q2162695) (← links)
- Smoothness parameter of power of Euclidean norm (Q2178876) (← links)
- Implementable tensor methods in unconstrained convex optimization (Q2227532) (← links)
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point (Q2278192) (← links)
- On the properties of the method of minimization for convex functions with relaxation on the distance to extremum (Q2287166) (← links)
- Regularized nonlinear acceleration (Q2288185) (← links)
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity (Q2311123) (← links)
- Accelerated primal-dual gradient descent with linesearch for convex, nonconvex, and nonsmooth optimization problems (Q2313241) (← links)
- Universal method of searching for equilibria and stochastic equilibria in transportation networks (Q2314190) (← links)
- Optimal subgradient methods: computational properties for large-scale linear inverse problems (Q2315075) (← links)
- Generalized uniformly optimal methods for nonlinear programming (Q2316202) (← links)
- Efficiency of minimizing compositions of convex functions and smooth maps (Q2330660) (← links)
- An adaptive proximal method for variational inequalities (Q2332639) (← links)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems (Q2415906) (← links)
- Empirical risk minimization: probabilistic complexity and stepsize strategy (Q2419551) (← links)
- Fast gradient methods for uniformly convex and weakly smooth problems (Q2673504) (← links)
- Perturbed Fenchel duality and first-order methods (Q2687051) (← links)
- A simple nearly optimal restart scheme for speeding up first-order methods (Q2696573) (← links)
- The impact of noise on evaluation complexity: the deterministic trust-region case (Q2696963) (← links)
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds (Q2696991) (← links)
- A family of subgradient-based methods for convex optimization problems in a unifying framework (Q2829570) (← links)
- A subgradient method for free material design (Q2832891) (← links)
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems (Q2957979) (← links)
- Non-monotone Behavior of the Heavy Ball Method (Q3296870) (← links)
- (Q3779681) (← links)
- Stochastic Model-Based Minimization of Weakly Convex Functions (Q4620418) (← links)
- Universal Regularization Methods: Varying the Power, the Smoothness and the Accuracy (Q4629334) (← links)