Regularized estimation for the least absolute relative error models with a diverging number of covariates
From MaRDI portal
Publication:1659468
DOI: 10.1016/j.csda.2015.10.012
zbMath: 1468.62213
OpenAlex: W2190785170
MaRDI QID: Q1659468
Hu Yang, Xiaochao Xia, Zhi Liu
Publication date: 15 August 2018
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2015.10.012
Keywords: variable selection; least squares approximation; oracle properties; diverging number of covariates; least absolute relative error
Computational methods for problems pertaining to statistics (62-08)
Ridge regression; shrinkage estimators (Lasso) (62J07)
Related Items (8)
- Asymptotics for least product relative error estimation and empirical likelihood with longitudinal data
- Estimation and empirical likelihood for single-index multiplicative models
- Optimal subsampling for multiplicative regression with massive data
- A new relative error estimation for partially linear multiplicative model
- Incorporating relative error criterion to conformal prediction for positive data
- Penalized relative error estimation of functional multiplicative regression models with locally sparse properties
- Optimal subsampling for least absolute relative error estimators with massive data
- Nonconcave penalized M-estimation for the least absolute relative errors model
Uses Software
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Empirical likelihood for least absolute relative error regression
- Variable selection in the accelerated failure time model via the bridge method
- Least product relative error estimation
- Rejoinder: One-step sparse estimates in nonconcave penalized likelihood models
- Variable selection in nonparametric additive models
- High-dimensional additive modeling
- Hedonic housing prices and the demand for clean air
- Relative-error prediction
- Nonconcave penalized likelihood with a diverging number of parameters.
- Least angle regression. (With discussion)
- Adaptive robust variable selection
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Regularized Estimation in the Accelerated Failure Time Model with High-Dimensional Covariates
- Unified LASSO Estimation by Least Squares Approximation
- Predicting software errors, during development, using nonlinear regression models: a comparative study
- Prediction, Linear Regression and the Minimum Sum of Relative Errors
- Rank-based inference for the accelerated failure time model
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Varying Coefficient Models
- Least Absolute Relative Error Estimation
- Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors