GENERALIZATION BOUNDS OF REGULARIZATION ALGORITHMS DERIVED SIMULTANEOUSLY THROUGH HYPOTHESIS SPACE COMPLEXITY, ALGORITHMIC STABILITY AND DATA QUALITY
DOI: 10.1142/S0219691311004213 · zbMath: 1219.62003 · MaRDI QID: Q3087503
Bin Zou, Hai Zhang, Xiang Yu Chang, Zong Ben Xu
Publication date: 16 August 2011
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Mathematics Subject Classification:
- Nonparametric estimation (62G05)
- Foundations and philosophical topics in statistics (62A01)
- Learning and adaptive systems in artificial intelligence (68T05)
- Statistical aspects of information-theoretic topics (62B10)
Related Items (4)
- Learning rates for the kernel regularized regression with a differentiable strongly convex loss
- REGULARIZED LEAST SQUARE ALGORITHM WITH TWO KERNELS
- ELASTIC-NET REGULARIZATION FOR LOW-RANK MATRIX RECOVERY
- ATTRIBUTE REDUCTION OF CONCEPT LATTICE BASED ON IRREDUCIBLE ELEMENTS
Cites Work
- Learning from dependent observations
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Regularization networks and support vector machines
- Learning rates of least-square regularized regression
- On the mathematical foundations of learning
- Learning Theory
- Capacity of reproducing kernel spaces in learning theory
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Shannon sampling and function reconstruction from point values