Theory and fast learned solver for \(\ell^1\)-TV regularization
From MaRDI portal
Publication: Q6659670
DOI: 10.1088/1361-6420/ad99f9
MaRDI QID: Q6659670
Jian-Jun Wang, Xin-Ling Liu, Bangti Jin
Publication date: 9 January 2025
Published in: Inverse Problems
Mathematics Subject Classification:
- Convex programming (90C25)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
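For context, the \(\ell^1\)-TV model named in the title typically combines an \(\ell^1\) data-fidelity term with a total variation penalty; a common formulation (the specific operators \(A\), \(D\) and parameter \(\alpha\) below are generic placeholders, not taken from the publication itself) reads:

```latex
\min_{x \in \mathbb{R}^n} \; \|Ax - y\|_{\ell^1} + \alpha \|Dx\|_{\ell^1},
```

where \(A\) is the measurement operator, \(y\) the (possibly impulsive-noise-corrupted) data, \(D\) a discrete difference operator so that \(\|Dx\|_{\ell^1}\) is the anisotropic total variation, and \(\alpha > 0\) a regularization parameter.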
Cites Work
- Title not available
- Title not available
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Stable recovery of sparse signals via \(\ell_p\)-minimization
- Compressed sensing with coherent and redundant dictionaries
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- Split Bregman method for large scale fused Lasso
- Analysis \(\ell_1\)-recovery with frames and Gaussian measurements
- Linearized alternating direction method of multipliers for sparse group and fused Lasso models
- Low-rank matrix recovery via regularized nuclear norm minimization
- A unified primal dual active set algorithm for nonconvex sparse recovery
- Group sparse recovery in impulsive noise via alternating direction method of multipliers
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Convex Recovery of a Structured Signal from Independent Random Linear Measurements
- Smoothing and first order methods: a unified framework
- Convergence Analysis of a Proximal-Like Minimization Algorithm Using Bregman Functions
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- Robust analysis \(\ell^1\)-recovery from Gaussian measurements and total variation minimization
- First-Order Methods in Optimization
- Guarantees of total variation minimization for signal recovery
- A Sharp Condition for Exact Support Recovery With Orthogonal Matching Pursuit
- High-Dimensional Probability
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Sparsity and Smoothness Via the Fused Lasso
- Sparsity regularization for parameter identification problems
- One-bit compressed sensing via \(\ell^p\) (\(0 < p < 1\))-minimization method
- Compressed Sensing with 1D Total Variation: Breaking Sample Complexity Barriers via Non-Uniform Recovery
- Selecting Regularization Parameters for Nuclear Norm-Type Minimization Problems
- Living on the edge: phase transitions in convex programs with random data
- On the Error in Phase Transition Computations for Compressed Sensing
- A Remark on the Restricted Isometry Property in Orthogonal Matching Pursuit
- Morozov's discrepancy principle for Tikhonov-type functionals with nonlinear operators
- Parameter selection for total-variation-based image restoration using discrepancy principle
- For most large underdetermined systems of linear equations the minimal \(\ell^1\)-norm solution is also the sparsest solution
- Proximité et dualité dans un espace hilbertien
- Unsupervised knowledge-transfer for learned image reconstruction
- Compressed sensing