Benefit of Interpolation in Nearest Neighbor Algorithms
From MaRDI portal
Publication:5089734
DOI: 10.1137/21M1437457 · zbMath: 1493.62180 · arXiv: 2202.11817 · OpenAlex: W2976673394 · MaRDI QID: Q5089734
Qifan Song, Yue Xing, Guang Cheng
Publication date: 15 July 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2202.11817
Related Items (2)
Uses Software
Cites Work
- Optimal weighted nearest neighbour classifiers
- Fast learning rates for plug-in classifiers
- Optimal aggregation of classifiers in statistical learning
- Fat-shattering and the learnability of real-valued functions
- Surprises in high-dimensional ridgeless least squares interpolation
- Local nearest neighbour classification with applications to semi-supervised learning
- Optimal Dual Martingales, Their Analysis, and Application to New Algorithms for Bermudan Products
- Distribution-free exponential error bound for nearest neighbor pattern classification
- DOI 10.1162/153244302760200704
- Two Models of Double Descent for Weak Features
- Benign overfitting in linear regression
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Convergence of the nearest neighbor rule