DOI: 10.1214/10-AOS825
zbMath: 1204.62086
arXiv: 1211.2998
OpenAlex: W2058007550
MaRDI QID: Q620564
Ming Yuan, Vladimir I. Koltchinskii
Publication date: 19 January 2011
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1211.2998
Automatic Component Selection in Additive Modeling of French National Electricity Load Forecasting
A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model
Improved Estimation of High-dimensional Additive Models Using Subspace Learning
Extreme eigenvalues of nonlinear correlation matrices with applications to additive models
Nonlinear Variable Selection via Deep Neural Networks
Hierarchical Total Variations and Doubly Penalized ANOVA Modeling for Multivariate Nonparametric Regression
Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies
Learning theory of multiple kernel learning
Fast learning rate of multiple kernel learning: trade-off between sparsity and smoothness
The two-sample problem for Poisson processes: adaptive tests with a nonasymptotic wild bootstrap approach
A unified penalized method for sparse additive quantile models: an RKHS approach
Grouping strategies and thresholding for high dimensional linear models
Grouped variable selection with discrete optimization: computational and statistical perspectives
Variable selection in additive quantile regression using nonconcave penalty
Kernel Ordinary Differential Equations
Decentralized learning over a network with Nyström approximation using SGD
Estimates on learning rates for multi-penalty distribution regression
Metamodel construction for sensitivity analysis
Regularizers for structured sparsity
Semiparametric regression models with additive nonparametric components and high dimensional parametric components
PAC-Bayesian estimation and prediction in sparse additive models
Learning Rates for Classification with Gaussian Kernels
Statistical inference in compound functional models
Kernel Meets Sieve: Post-Regularization Confidence Bands for Sparse Additive Model
Optimal prediction for high-dimensional functional quantile regression in reproducing kernel Hilbert spaces
Minimax optimal estimation in partially linear additive models under high dimension
Randomized sketches for kernel CCA
Additive model selection
Minimax and Adaptive Prediction for Functional Linear Regression
Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
Optimal learning rates of \(l^p\)-type multiple kernel learning under general conditions
Learning general sparse additive models from point queries in high dimensions
Regularizing Double Machine Learning in Partially Linear Endogenous Models
Kernel Knockoffs Selection for Nonparametric Additive Models
A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
Learning non-parametric basis independent models from point queries via low-rank methods
Tight conditions for consistency of variable selection in the context of high dimensionality
A semiparametric model for matrix regression
Information based complexity for high dimensional sparse functions
Approximate nonparametric quantile regression in reproducing kernel Hilbert spaces via random projection
Sparse RKHS estimation via globally convex optimization and its application in LPV-IO identification
Inference for high-dimensional varying-coefficient quantile regression
Nonparametric variable screening for multivariate additive models
Rates of contraction with respect to \(L_2\)-distance for Bayesian nonparametric regression
Doubly penalized estimation in additive regression with high-dimensional data
Learning rates for partially linear support vector machine in high dimensions
Minimax-optimal nonparametric regression in high dimensions