An RKHS-based approach to double-penalized regression in high-dimensional partially linear models
DOI: 10.1016/j.jmva.2018.07.013 · zbMath: 1401.62057 · OpenAlex: W2883706650 · Wikidata: Q129461820 · Scholia: Q129461820 · MaRDI QID: Q1795582
Wenquan Cui, Haoyang Cheng, Jiajing Sun
Publication date: 16 October 2018
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2018.07.013
Keywords: high-dimensional data; reproducing kernel Hilbert space; oracle property; partially linear model; Sacks-Ylvisaker conditions; representer theorem; eigen-analysis; SCAD (smoothly clipped absolute deviation) penalty
MSC classification: Asymptotic properties of parametric estimators (62F12) · Nonparametric regression and quantile regression (62G08) · Ridge regression; shrinkage estimators (Lasso) (62J07) · Asymptotic properties of nonparametric inference (62G20)
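For orientation: in a partially linear model \(y_i = x_i^\top \beta + f(t_i) + \varepsilon_i\), the double penalization named in the title plausibly couples a SCAD penalty on the high-dimensional parametric part with an RKHS roughness penalty on the nonparametric part. A minimal sketch of such an objective, assuming the standard formulation rather than the paper's exact notation:

\[
\min_{\beta \in \mathbb{R}^p,\; f \in \mathcal{H}} \; \frac{1}{n}\sum_{i=1}^n \bigl(y_i - x_i^\top \beta - f(t_i)\bigr)^2 \;+\; \sum_{j=1}^p p_{\lambda_1}\!\bigl(|\beta_j|\bigr) \;+\; \lambda_2\, \|f\|_{\mathcal{H}}^2,
\]

where \(p_{\lambda_1}\) is the SCAD penalty of Fan and Li (cited below) and \(\|\cdot\|_{\mathcal{H}}\) is the norm of the reproducing kernel Hilbert space \(\mathcal{H}\). By the representer theorem (a listed keyword), the minimizer in \(f\) can be taken of the finite form \(f(\cdot) = \sum_{i=1}^n c_i K(t_i, \cdot)\) for the reproducing kernel \(K\), reducing the variational problem to a finite-dimensional one.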
Related Items (1)
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- A partially linear framework for massive heterogeneous data
- Automatic model selection for partially linear models
- Component selection and smoothing in multivariate nonparametric regression
- Penalized variable selection procedure for Cox models with semiparametric relative risk
- SCAD-penalized regression in high-dimensional partially linear models
- Estimating the dimension of a model
- Nonparametric input estimation in physiological systems: Problems, methods, and case studies
- Generalized additive models for current status data
- Consistent covariate selection and post model selection inference in semiparametric regression
- Least angle regression. (With discussion)
- Estimation and model selection in generalized additive partial linear models for correlated data with diverging number of covariates
- Sparse and efficient estimation for partial spline models with increasing dimension
- An introduction to the Hilbert-Schmidt SVD using iterated Brownian bridge kernels
- Focused information criterion and model averaging for generalized additive partial linear models
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Profiled forward regression for ultrahigh dimensional variable screening in semiparametric partially linear models
- Surface estimation, variable selection, and the nonparametric oracle property
- Smoothing Splines
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Better Subset Regression Using the Nonnegative Garrote
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Two-stage model selection procedures in partially linear regression
- A Statistical View of Some Chemometrics Regression Tools
- Generalized additive partial linear models with high-dimensional covariates
- New Estimation and Model Selection Procedures for Semiparametric Modeling in Longitudinal Data Analysis
- Estimation and variable selection for semiparametric additive partial linear models