scientific article - MaRDI portal


zbMath: 1222.68339 · MaRDI QID: Q3174075 (Publication: 3174075)

Ding-Xuan Zhou, Yiming Ying

Publication date: 12 October 2011

Full work available at URL: http://www.jmlr.org/papers/v8/ying07a.html

Title: not displayed (zbMATH Open Web Interface contents unavailable due to conflicting licenses)



Related Items (35)

Learning with sample dependent hypothesis spaces
The optimal solution of multi-kernel regularization learning
Fast learning rate of non-sparse multiple kernel learning and optimal regularization strategies
Multi-kernel regularized classifiers
Error analysis on regularized regression based on the maximum correntropy criterion
Summation of Gaussian shifts as Jacobi's third theta function
Convergence of online pairwise regression learning with quadratic loss
Quantitative convergence analysis of kernel based large-margin unified machines
Optimal regression rates for SVMs using Gaussian kernels
Conditional quantiles with varying Gaussians
Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
Refined Rademacher Chaos Complexity Bounds with Applications to the Multikernel Learning Problem
A Note on Support Vector Machines with Polynomial Kernels
Learning Rates for Classification with Gaussian Kernels
Convergence analysis of online algorithms
Learning performance of regularized regression with multiscale kernels based on Markov observations
Orthogonality from disjoint support in reproducing kernel Hilbert spaces
Parzen windows for multi-class classification
Learning and approximation by Gaussians on Riemannian manifolds
Learning the coordinate gradients
Classification with Gaussians and convex loss. II: Improving error bounds by noise conditions
Learning rates of multi-kernel regularized regression
Approximation of kernel matrices by circulant matrices and its application in kernel selection methods
Nonlinear approximation using Gaussian kernels
Some properties of Gaussian reproducing kernel Hilbert spaces and their implications for function approximation and learning theory
Unregularized online algorithms with varying Gaussians
Distributed regularized least squares with flexible Gaussian kernels
Error Estimates for Multivariate Regression on Discretized Function Spaces
Least square regularized regression for multitask learning
On extension theorems and their connection to universal consistency in machine learning
Error bounds for learning the kernel
Unnamed Item
High order Parzen windows and randomized sampling
Online Classification with Varying Gaussians
Optimal learning with Gaussians and correntropy loss






