Tuning parameter selection in sparse regression modeling
Publication:1621202
DOI: 10.1016/j.csda.2012.10.005
zbMath: 1400.62006
OpenAlex: W1994897986
MaRDI QID: Q1621202
Shohei Tateishi, Sadanori Konishi, Kei Hirose
Publication date: 8 November 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2012.10.005
Related Items (5)
- Model selection in kernel ridge regression
- A novel elastic net-based NGBMC(1,n) model with multi-objective optimization for nonlinear time series forecasting
- A modified information criterion for model selection
- Sparse estimation via nonconcave penalized likelihood in factor analysis model
- Prediction errors for penalized regressions based on generalized approximate message passing
Uses Software
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- A note on the generalized degrees of freedom under the \(L_{1}\) loss function
- The solution path of the generalized lasso
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Slope heuristics: overview and implementation
- On the degrees of freedom in shrinkage estimation
- The composite absolute penalties family for grouped and hierarchical variable selection
- Estimation of the mean of a multivariate normal distribution
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Heuristics of instability and stabilization in model selection
- Least angle regression. (With discussion)
- Information criteria and statistical modeling.
- Pathwise coordinate optimization
- On the "degrees of freedom" of the lasso
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- SparseNet: Coordinate Descent With Nonconvex Penalties
- How Biased is the Apparent Error Rate of a Prediction Rule?
- On Measuring and Correcting the Effects of Data Mining and Model Selection
- Further analysis of the data by Akaike's information criterion and the finite corrections
- Smoothing Parameter Selection in Nonparametric Regression Using an Improved Akaike Information Criterion
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Adaptive Model Selection
- A Statistical View of Some Chemometrics Regression Tools
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- Some Comments on \(C_p\)
- The Estimation of Prediction Error