Discrete optimization methods for group model selection in compressed sensing
From MaRDI portal
Publication: 2235146
DOI: 10.1007/s10107-020-01529-7 · zbMath: 1478.90058 · arXiv: 1904.01542 · OpenAlex: W3034627166 · MaRDI QID: Q2235146
Oliver Schaudt, Jannis Kurtz, Bubacarr Bah
Publication date: 20 October 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1904.01542
MSC classification: Applications of mathematical programming (90C90); Integer programming (90C10); Combinatorial optimization (90C27)
Cites Work
- A mathematical introduction to compressive sensing
- Oracle inequalities and optimal inference under group sparsity
- Robust combinatorial optimization under convex and discrete cost uncertainty
- Iterative hard thresholding for compressed sensing
- The benefit of group sparsity
- Graph minors. V. Excluding a planar graph
- Partitioning procedures for solving mixed-variables programming problems
- Treewidth. Computations and approximations
- Robust discrete optimization and its applications
- Benders decomposition for very large scale partial set covering and maximal covering location problems
- Deterministic constructions of compressed sensing matrices
- Generalized Benders decomposition
- Group-Sparse Model Selection: Hardness and Relaxations
- Approximation Algorithms for Model-Based Compressive Sensing
- Least Squares Superposition Codes With Bernoulli Dictionary are Still Reliable at Rates up to Capacity
- Expander graphs and their applications
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Efficient and Robust Compressed Sensing Using Optimized Expander Graphs
- Robust Recovery of Signals From a Structured Union of Subspaces
- Approximations for Monotone and Nonmonotone Submodular Maximization with Knapsack Constraints
- Simultaneous Support Recovery in High Dimensions: Benefits and Perils of Block $\ell_1/\ell_\infty$-Regularization
- Model-Based Compressive Sensing
- On Model-Based RIP-1 Matrices
- Fast Sparse Superposition Codes Have Near Exponential Error Probability for $R < \mathcal{C}$
- Model-based Sketching and Recovery with Expanders
- Approximation-Tolerant Model-Based Compressive Sensing
- Learning with Structured Sparsity
- Smooth Orthogonal Drawings of Planar Graphs
- Model Selection and Estimation in Regression with Grouped Variables
- Stable signal recovery from incomplete and inaccurate measurements
- A Linear-Time Algorithm for Finding Tree-Decompositions of Small Treewidth
- Compressed sensing