Robust Decoding from 1-Bit Compressive Sampling with Ordinary and Regularized Least Squares
From MaRDI portal
Publication: 3174766
DOI: 10.1137/17M1154102
zbMath: 1395.49033
arXiv: 1711.01206
OpenAlex: W2810437963
MaRDI QID: Q3174766
Xiliang Lu, Jian Huang, Li-ping Zhu, Yu Ling Jiao
Publication date: 18 July 2018
Published in: SIAM Journal on Scientific Computing
Full work available at URL: https://arxiv.org/abs/1711.01206
Keywords: continuation; 1-bit compressive sensing; primal dual active set algorithm; \(\ell_1\)-regularized least squares; one-step convergence
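The title and keywords describe decoding 1-bit compressive measurements \(y = \operatorname{sign}(Ax)\) with least-squares-type estimators. A minimal illustrative sketch (not the paper's actual algorithm, which uses a primal dual active set method with continuation): back-project the sign measurements by one ordinary least-squares-style step \(A^\top y / m\), then hard-threshold to the assumed sparsity level. All dimensions and the sparsity level below are made-up toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 200, 800, 5  # ambient dimension, number of measurements, sparsity (toy values)

# Sparse unit-norm signal; 1-bit measurements only identify the direction of x
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
x /= np.linalg.norm(x)

A = rng.standard_normal((m, n))
y = np.sign(A @ x)  # 1-bit compressive measurements

# One back-projection step: for Gaussian A, E[A.T @ y / m] is proportional to x
z = A.T @ y / m

# Hard-threshold: keep the s largest entries in magnitude, renormalize
xhat = np.zeros(n)
idx = np.argsort(-np.abs(z))[:s]
xhat[idx] = z[idx]
xhat /= np.linalg.norm(xhat)

print(float(xhat @ x))  # cosine similarity with the true direction; close to 1
```

Replacing the hard threshold by an \(\ell_1\)-regularized least-squares solve on \(\min_x \tfrac12\|Ax - y\|^2 + \lambda\|x\|_1\) gives the regularized variant referenced in the keywords.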
Newton-type methods (49M15) Numerical methods for inverse problems for boundary value problems involving PDEs (65N21) Inverse problems in optimal control (49N45)
Related Items
REMI: REGRESSION WITH MARGINAL INFORMATION AND ITS APPLICATION IN GENOME-WIDE ASSOCIATION STUDIES, Distributed Sparse Composite Quantile Regression in Ultrahigh Dimensions, A primal dual active set with continuation algorithm for high-dimensional nonconvex SICA-penalized regression, Distributed Decoding From Heterogeneous 1-Bit Compressive Measurements, Just least squares: binary compressive sampling with low generative intrinsic dimension, A unified primal dual active set algorithm for nonconvex sparse recovery, Quadratic Convergence of Smoothing Newton's Method for 0/1 Loss Optimization, An Algorithm Solving Compressive Sensing Problem Based on Maximal Monotone Operators
Uses Software
Cites Work
- A mathematical introduction to compressive sensing
- Matrix-free interior point method for compressed sensing problems
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- Noisy 1-bit compressive sensing: models and algorithms
- Introductory lectures on convex optimization. A basic course.
- A nonsmooth version of Newton's method
- One-bit compressed sensing with non-Gaussian measurements
- Information criteria and statistical modeling.
- Estimation in High Dimensions: A Geometric Perspective
- One-Bit Compressed Sensing by Linear Programming
- Inverse Problems
- Proximal Splitting Methods in Signal Processing
- Optimization with Sparsity-Inducing Penalties
- One-Bit Compressive Sensing With Norm Estimation
- Robust 1-Bit Compressive Sensing via Binary Stable Embeddings of Sparse Vectors
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Selected Works of David Brillinger
- A Regularization Parameter for Nonsmooth Tikhonov Regularization
- A Duality-Based Splitting Method for $\ell^1$-$TV$ Image Restoration with Automatic Regularization Parameter Choice
- A Preconditioner for A Primal-Dual Newton Conjugate Gradient Method for Compressed Sensing Problems
- Lagrange Multiplier Approach to Variational Problems and Applications
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- A Semismooth Newton Method for $\mathrm{L}^1$ Data Fitting with Automatic Choice of Regularization Parameters and Noise Calibration
- Atomic Decomposition by Basis Pursuit
- A new approach to variable selection in least squares problems
- Trust, But Verify: Fast and Accurate Signal Recovery From 1-Bit Compressive Measurements
- Robust 1-bit Compressive Sensing Using Adaptive Outlier Pursuit
- A Primal Dual Active Set Algorithm With Continuation for Compressed Sensing
- High-dimensional estimation with geometric constraints
- Iterative parameter choice by discrepancy principle
- One-Bit Compressed Sensing by Greedy Algorithms
- Signal Recovery by Proximal Forward-Backward Splitting
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers