Optimal and Safe Estimation for High-Dimensional Semi-Supervised Learning
Publication: 6651378
DOI: 10.1080/01621459.2023.2277409
MaRDI QID: Q6651378
Yang Ning, Jiwei Zhao, Heping Zhang, Unnamed Author
Publication date: 10 December 2024
Published in: Journal of the American Statistical Association
Cites Work
- An analysis of penalized interaction models
- Transductive versions of the Lasso and the Dantzig selector
- High-dimensional inference in misspecified linear models
- Deviation optimal learning using greedy \(Q\)-aggregation
- Component selection and smoothing in multivariate nonparametric regression
- Variable selection in nonparametric additive models
- High-dimensional additive modeling
- On asymptotically efficient estimation in semiparametric models
- On adaptive estimation
- On the prediction loss of the Lasso in the partially labeled setting
- Minimax estimation of a functional on a structured high-dimensional model
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Slope meets Lasso: improved oracle bounds and optimality
- Models as approximations. I. Consequences illustrated with linear regression
- A survey on semi-supervised learning
- Semi-supervised inference: general theory and estimation of means
- Simultaneous analysis of Lasso and Dantzig selector
- Efficient and adaptive linear regression in semi-supervised settings
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Assumption Lean Regression
- Double/debiased machine learning for treatment and structural parameters
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Transfer Learning for High-Dimensional Linear Regression: Prediction, Estimation and Minimax Optimality