Convergence analysis of an empirical eigenfunction-based ranking algorithm with truncated sparsity
Publication: 1722329
DOI: 10.1155/2014/197476
zbMath: 1470.68200
OpenAlex: W2109579242
Wikidata: Q59035969 (Scholia: Q59035969)
MaRDI QID: Q1722329
Min Xu, Shaofan Wang, Qin Fang
Publication date: 14 February 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2014/197476
MSC Classification
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
- Statistical ranking and selection procedures (62F07)
Related Items
- A linear functional strategy for regularized ranking
- On the convergence rate and some applications of regularized ranking algorithms
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
Cites Work
- An empirical feature-based learning algorithm producing sparse approximations
- The convergence rate of a regularized ranking algorithm
- Model selection for regularized least-squares algorithm in learning theory
- On regularization algorithms in learning theory
- Variation of discrete spectra
- The Hoffman-Wielandt inequality in infinite dimensions
- Random matrix approximation of spectra of integral operators
- Ranking and empirical minimization of \(U\)-statistics
- Learning theory estimates via integral operators and their approximations
- The variation of the spectrum of a normal matrix
- Indefinite kernel network with dependent sampling
- Learning Theory
- Theory of Reproducing Kernels