The Lasso problem and uniqueness
From MaRDI portal
Publication:1951165
DOI: 10.1214/13-EJS815
zbMath: 1337.62173
arXiv: 1206.0313
MaRDI QID: Q1951165
Publication date: 29 May 2013
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1206.0313
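The paper studies when the lasso problem, \(\min_\beta \tfrac12\|y - X\beta\|_2^2 + \lambda\|\beta\|_1\), has a unique solution; a key result is that uniqueness holds almost surely when the entries of \(X\) are drawn from a continuous distribution (columns in "general position"), even with more predictors than observations, and that any unique solution has at most \(\min(n, p)\) nonzero coefficients. A minimal sketch of the setting, using scikit-learn's `Lasso` (the data, penalty level, and package choice are illustrative assumptions, not from the paper):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Lasso problem considered in the paper:
#   minimize (1/2) * ||y - X b||_2^2 + lam * ||b||_1
# With continuous random entries, the columns of X are in general
# position almost surely, so the solution is unique even for p > n.
rng = np.random.default_rng(0)
n, p = 20, 50                        # more predictors than observations
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]     # sparse ground truth
y = X @ beta_true + 0.1 * rng.standard_normal(n)

lam = 0.1
# sklearn minimizes (1/(2n))||y - Xb||^2 + alpha*||b||_1,
# so alpha = lam / n matches the objective above.
model = Lasso(alpha=lam / n, max_iter=50_000).fit(X, y)
support = np.flatnonzero(model.coef_)
print(len(support))                  # size of the active set (sparse, << p)
```

Consistent with the uniqueness theory, the fitted active set is small relative to \(p\); soft-thresholding in coordinate descent produces exact zeros for the inactive coefficients.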
Related Items
- On cross-validated Lasso in high dimensions
- Inference in adaptive regression via the Kac-Rice formula
- Model selection consistency of Lasso for empirical data
- Hybrid safe-strong rules for efficient optimization in Lasso-type problems
- On the uniqueness of solutions for the basis pursuit in the continuum
- Exact post-selection inference, with application to the Lasso
- One condition for solution uniqueness and robustness of both \(\ell_1\)-synthesis and \(\ell_1\)-analysis minimizations
- The inverse problem for conducting defective lattices
- The use of vector bootstrapping to improve variable selection precision in Lasso models
- Iteration-complexity analysis of a generalized alternating direction method of multipliers
- An Alternating Method for Cardinality-Constrained Optimization: A Computational Study for the Best Subset Selection and Sparse Portfolio Problems
- Something Borrowed, Something New: Precise Prediction of Outcomes from Diverse Genomic Profiles
- Degrees of freedom for piecewise Lipschitz estimators
- Random weighting in LASSO regression
- A priori sparsification of Galerkin models
- Data-Driven Discovery of Closure Models
- Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
- Natural coordinate descent algorithm for \(\ell_1\)-penalised regression in generalised linear models
- Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
- A forward and backward stagewise algorithm for nonconvex loss functions with adaptive Lasso
- In defense of LASSO
- Sparsest representations and approximations of an underdetermined linear system
- Monte Carlo Simulation for Lasso-Type Problems by Estimator Augmentation
- The geometry of least squares in the 21st century
- Adaptive multi-penalty regularization based on a generalized Lasso path
- On the distribution, model selection properties and uniqueness of the Lasso estimator in low and high dimensions
- Post-selection inference of generalized linear models based on the lasso and the elastic net
- Quadratic growth conditions and uniqueness of optimal solution to Lasso
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates
- Primal path algorithm for compositional data analysis
- Estimation of the Spatial Weighting Matrix for Spatiotemporal Data under the Presence of Structural Breaks
- An inexact proximal generalized alternating direction method of multipliers
- Controlling False Discovery Rate Using Gaussian Mirrors
- Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
- Debiasing convex regularized estimators and interval estimation in linear models
- Comparing solution paths of sparse quadratic minimization with a Stieltjes matrix
- A convex-nonconvex strategy for grouped variable selection
- LASSO Reloaded: A Variational Analysis Perspective with Applications to Compressed Sensing
- The Geometry of Sparse Analysis Regularization
- Convergence rates of the heavy-ball method under the Łojasiewicz property
- When Ramanujan meets time-frequency analysis in complicated time series analysis
- Variable selection and regularization via arbitrary rectangle-range generalized elastic net
- On sparsity-inducing methods in system identification and state estimation
- An alternative to synthetic control for models with many covariates under sparsity
- Goodness-of-Fit Tests for High Dimensional Linear Models
- LARS-type algorithm for group Lasso
- Sparse Identification and Estimation of Large-Scale Vector AutoRegressive Moving Averages
- The \(l_1\)-based sparsification of energy interactions in unsteady lid-driven cavity flow
- Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
- TV-based reconstruction of periodic functions
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Maximal solutions of sparse analysis regularization
- An Introduction to Compressed Sensing
- Oscillation of Metropolis-Hastings and simulated annealing algorithms around LASSO estimator
- Efficient Bayesian regularization for graphical model selection
- The homotopy method revisited: Computing solution paths of \(\ell_1\)-regularized problems
- A significance test for the lasso
- Discussion: "A significance test for the lasso"
- Rejoinder: "A significance test for the lasso"
- Consistent parameter estimation for Lasso and approximate message passing
- The generalized Lasso problem and uniqueness
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- Machine learning subsurface flow equations from data
- Local and global convergence of a general inertial proximal splitting scheme for minimizing composite functions
- Estimation of high-dimensional graphical models using regularized score matching
- Leave-one-out cross-validation is risk consistent for Lasso
- Robust elastic net estimators for variable selection and identification of proteomic biomarkers
- Numerical analysis for conservation laws using \(l_1\) minimization
- A study on tuning parameter selection for the high-dimensional lasso
- Safe feature elimination for non-negativity constrained convex optimization
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- On the Probabilistic Cauchy Theory for Nonlinear Dispersive PDEs
- A partially inexact proximal alternating direction method of multipliers and its iteration-complexity analysis
- Free disposal hull condition to verify when efficiency coincides with weak efficiency
- On Computationally Tractable Selection of Experiments in Measurement-Constrained Regression Models
- Gap Safe screening rules for sparsity enforcing penalties
- Sparsest piecewise-linear regression of one-dimensional data
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- Feature selection for data integration with mixed multiview data
- On Lasso refitting strategies
- Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
- Sparse low-rank separated representation models for learning from data
- Two-sided space-time \(L^1\) polynomial approximation of hypographs within polynomial optimal control
- On the Length of Post-Model-Selection Confidence Intervals Conditional on Polyhedral Constraints
- Risk bound of transfer learning using parametric feature mapping and its application to sparse coding
- LASSO for streaming data with adaptative filtering
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- Discussion: "A significance test for the lasso"
- CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration
- Analysis of a nonsmooth optimization approach to robust estimation
Uses Software
Cites Work
- Degrees of freedom in lasso problems
- The Dantzig selector and sparsity oracle inequalities
- The solution path of the generalized lasso
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization
- Near-ideal model selection by \(\ell _{1}\) minimization
- Sparsity in penalized empirical risk minimization
- Least angle regression. (With discussion)
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Analysis of backtrack algorithms for listing all vertices and all faces of a convex polyhedron.
- Recovery of Exact Sparse Representations in the Presence of Bounded Noise
- Atomic Decomposition by Basis Pursuit
- A new approach to variable selection in least squares problems
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net
- For most large underdetermined systems of linear equations the minimal \(\ell_1\)-norm solution is also the sparsest solution
- Convex Analysis
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers