Modeling interactive components by coordinate kernel polynomial models
Publication: Q2063336
DOI: 10.3934/mfc.2020010
zbMath: 1485.68214
OpenAlex: W3033656597
MaRDI QID: Q2063336
Publication date: 11 January 2022
Published in: Mathematical Foundations of Computing
Full work available at URL: https://doi.org/10.3934/mfc.2020010
Keywords: kernel method; generalization; information criterion; reparametrization; coordinate kernel polynomial model; interactive component
MSC classifications: Nonparametric regression and quantile regression (62G08); Learning and adaptive systems in artificial intelligence (68T05); Rate of convergence, degree of approximation (41A25)
Related Items (12)
- Rates of convergence of randomized Kaczmarz algorithms in Hilbert spaces
- Distributed semi-supervised regression learning with coefficient regularization
- Inequalities involving Berezin norm and Berezin number
- Rates of approximation by ReLU shallow neural networks
- Neural network interpolation operators optimized by Lagrange polynomial
- Convergence on sequences of Szász-Jakimovski-Leviatan type operators and related results
- Approximation properties of exponential type operators connected to \(p(x)=2x^{3/2}\)
- Rate of convergence of Stancu type modified \(q\)-Gamma operators for functions with derivatives of bounded variation
- Dunkl analogue of Szász Schurer Beta bivariate operators
- On Szász-Durrmeyer type modification using Gould Hopper polynomials
- Coefficient-based regularized distribution regression
- On the speed of uniform convergence in Mercer's theorem
Cites Work
- Component selection and smoothing in multivariate nonparametric regression
- Multi-kernel regularized classifiers
- A nonlinear multi-dimensional variable selection method for high dimensional data: sparse MAVE
- Gaussian Markov distributions over finite graphs
- Wrappers for feature subset selection
- Multivariate adaptive regression splines
- Robust variable selection through MAVE
- Rademacher Chaos Complexities for Learning the Kernel Problem
- Radial Basis Functions
- DOI: 10.1162/1532443041424300
- DOI: 10.1162/153244303321897690
- Overlapping sliced inverse regression for dimension reduction
- Semi-supervised learning with summary statistics
- Distributed learning with indefinite kernels
- Regularization and Variable Selection Via the Elastic Net
- High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
- Sliced Inverse Regression with Regularizations
- Gene selection for cancer classification using support vector machines