Fast rates by transferring from auxiliary hypotheses
DOI: 10.1007/s10994-016-5594-4 · zbMath: 1453.68153 · arXiv: 1412.1619 · OpenAlex: W2509517678 · MaRDI QID: Q2361574
Ilja Kuzborskij, Francesco Orabona
Publication date: 30 June 2017
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1412.1619
Keywords: Rademacher complexity; transfer learning; domain adaptation; fast-rate generalization bounds; smooth loss functions; strongly-convex regularizers
MSC classification: General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
Related Items (3)
- Handling concept drift via model reuse
- Unnamed Item
- Risk bound of transfer learning using parametric feature mapping and its application to sparse coding
Cites Work
- A new learning paradigm: learning using privileged information
- Domain adaptation and sample bias correction theory and algorithm for regression
- Model selection for regularized least-squares algorithm in learning theory
- A theory of learning from different domains
- Local Rademacher complexities
- On the Hardness of Domain Adaptation and the Utility of Unlabeled Target Samples
- Stability and generalization (DOI: 10.1162/153244302760200704)
- Rademacher and Gaussian complexities: risk bounds and structural results (DOI: 10.1162/153244303321897690)
- Understanding Machine Learning