Rejoinder: "Sparse regression: scalable algorithms and empirical performance"
From MaRDI portal
Publication: 2225319
DOI: 10.1214/20-STS701REJ
MaRDI QID: Q2225319
Jean Pauphilet, Dimitris J. Bertsimas, Bart P. G. Van Parys
Publication date: 8 February 2021
Published in: Statistical Science
Full work available at URL: https://projecteuclid.org/euclid.ss/1605603636
Cites Work
- Best subset selection via a modern optimization lens
- Characterization of the equivalence of robustification and regularization in linear and matrix regression
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Scalable holistic linear regression
- Sparse regression: scalable algorithms and empirical performance
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- A discussion on practical considerations with sparse regression methodologies
- Modern variable selection in action: comment on the papers by HTT and BPV
- A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
- High-dimensional graphs and variable selection with the Lasso
- OR Forum—An Algorithmic Approach to Linear Regression
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Stable signal recovery from incomplete and inaccurate measurements