Asymptotically faster estimation of high-dimensional additive models using subspace learning
From MaRDI portal
Publication: 6641032
DOI: 10.1111/sjos.12756
MaRDI QID: Q6641032
Kejun He, Shiyuan He, Jianhua Z. Huang
Publication date: 20 November 2024
Published in: Scandinavian Journal of Statistics
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Sparsity in multiple kernel learning
- Consistent group selection in high-dimensional linear regression
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Variable selection in nonparametric additive models
- High-dimensional additive modeling
- Additive regression and other nonparametric models
- Local asymptotics for regression splines and confidence regions
- Optimal global rates of convergence for nonparametric regression
- On the conditions used to prove oracle results for the Lasso
- A significance test for the lasso
- Doubly penalized estimation in additive regression with high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Variable selection and estimation in high-dimensional varying-coefficient models
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Modern Multivariate Statistical Techniques
- Decoding by Linear Programming
- High-Dimensional Statistics
- Sparse Additive Models
- Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
- Dimensionality Reduction and Variable Selection in Multivariate Varying-Coefficient Models With a Large Number of Covariates
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- Selective factor extraction in high dimensions
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Model Selection and Estimation in Regression with Grouped Variables
- A practical guide to splines
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Sparse Reduced Rank Huber Regression in High Dimensions