Optimal subgradient methods: computational properties for large-scale linear inverse problems
From MaRDI portal
Publication: 2315075
DOI: 10.1007/s11081-018-9378-5
zbMath: 1418.90190
OpenAlex: W2791720296
MaRDI QID: Q2315075
Publication date: 31 July 2019
Published in: Optimization and Engineering
Full work available at URL: https://phaidra.univie.ac.at/o:937301
Keywords: high-dimensional data; subgradient methods; linear inverse problems; structured convex optimization; optimal complexity; first-order information; sparse nonsmooth optimization
Related Items (7)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems
- Smoothing unadjusted Langevin algorithms for nonsmooth composite potential functions
- Stochastic incremental mirror descent algorithms with Nesterov smoothing
- OSGA
- Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
- Impulse noise removal by an adaptive trust-region method
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima
Uses Software
Cites Work
- Nonlinear total variation based noise removal algorithms
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- OSGA: a fast subgradient algorithm with optimal complexity
- Gradient methods for minimizing composite functions
- Nonsmooth optimization via quasi-Newton methods
- Universal gradient methods for convex optimization problems
- Introductory lectures on convex optimization. A basic course.
- A variational approach to remove outliers and impulse noise
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Templates for convex cone problems with applications to sparse signal recovery
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- A double smoothing technique for solving unconstrained nondifferentiable convex optimization problems
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Mirror Prox algorithm for multi-term composite minimization and semi-separable problems
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Atomic Decomposition by Basis Pursuit
- Constrained Total Variation Deblurring Models and Fast Algorithms Based on Alternating Direction Method of Multipliers
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- Solving Ill-Conditioned and Singular Linear Systems: A Tutorial on Regularization
- Minimizers of Cost-Functions Involving Nonsmooth Data-Fidelity Terms. Application to the Processing of Outliers
- On the acceleration of the double smoothing technique for unconstrained convex optimization problems
- Optimal Primal-Dual Methods for a Class of Saddle Point Problems
- An Accelerated Linearized Alternating Direction Method of Multipliers
- Fast Gradient-Based Algorithms for Constrained Total Variation Image Denoising and Deblurring Problems
- A Douglas--Rachford Type Primal-Dual Method for Solving Inclusions with Mixtures of Composite and Parallel-Sum Type Monotone Operators
- Compressed sensing
- Benchmarking optimization software with performance profiles.