Coefficient-based regularized regression with dependent and unbounded sampling
DOI: 10.1142/S0219691316500399 · zbMath: 1419.62086 · OpenAlex: W2468357046 · MaRDI QID: Q2819179
Publication date: 28 September 2016
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691316500399
Keywords: error bounds, reproducing kernel Hilbert space, learning rates, coefficient-based regularization scheme, mixing samples
MSC classification: Nonparametric regression and quantile regression (62G08); Learning and adaptive systems in artificial intelligence (68T05)
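For context on the scheme named in the keywords, a minimal sketch of the usual coefficient-based \(\ell^2\)-regularized least squares estimator is given below; the paper's exact normalization, penalty exponent, and kernel assumptions may differ. Given a sample \(\mathbf{z} = \{(x_i, y_i)\}_{i=1}^{m}\), possibly drawn from a mixing (dependent) and unbounded process, and a kernel \(K\), the estimator takes the form
\[
f_{\mathbf{z}} = \sum_{i=1}^{m} \alpha_{\mathbf{z},i}\, K(x_i, \cdot),
\qquad
\alpha_{\mathbf{z}} = \arg\min_{\alpha \in \mathbb{R}^{m}}
\left\{ \frac{1}{m} \sum_{j=1}^{m} \Big( \sum_{i=1}^{m} \alpha_i K(x_i, x_j) - y_j \Big)^{2}
+ \lambda \sum_{i=1}^{m} \alpha_i^{2} \right\}.
\]
A common motivation for penalizing the coefficient vector rather than the RKHS norm is that the kernel then need not be symmetric or positive semi-definite; error bounds and learning rates are derived under mixing conditions on the sampling process.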
Related Items (6)
- Error analysis of the moving least-squares method with non-identical sampling
- Convergence rate for the moving least-squares learning with dependent sampling
- Error analysis for \(l^q\)-coefficient regularized moving least-square regression
- Coefficient-based \(l^q\)-regularized regression with indefinite kernels and unbounded sampling
- Analysis of Regression Algorithms with Unbounded Sampling
- Error analysis of the moving least-squares regression learning algorithm with β-mixing and non-identical sampling
Cites Work
- Regularized least square regression with unbounded and dependent sampling
- Learning rates for least square regressions with coefficient regularization
- On the mathematical foundations of learning
- Application of integral operator for vector-valued regression learning
- Learning by atomic norm regularization with polynomial kernels
- Optimal rate for support vector machine regression with Markov chain samples