Multikernel Regression with Sparsity Constraint
DOI: 10.1137/20M1318882 · zbMath: 1479.46026 · arXiv: 1811.00836 · OpenAlex: W3127203349 · MaRDI QID: Q4999353
Shayan Aziznejad, Michael Unser
Publication date: 6 July 2021
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1811.00836
Keywords: generalized LASSO; regularization theory; representer theorem; generalized total variation; multiple-kernel learning
MSC classes: Nonparametric regression and quantile regression (62G08); Learning and adaptive systems in artificial intelligence (68T05); Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22); Spaces of measures (46E27); Linear operators and ill-posed problems, regularization (47A52)
Related Items (4)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Exact reconstruction using Beurling minimal extrapolation
- Solving support vector machines in reproducing kernel Banach spaces with positive definite functions
- Super-resolution from noisy data
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Support recovery for sparse super-resolution of positive measures
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Théorie des distributions à valeurs vectorielles
- Theory of Bessel potentials. I
- Regularization in kernel learning
- Exact support recovery for sparse spikes deconvolution
- Spline solutions to L\(^1\) extremal problems in one and several variables
- Locally adaptive regression splines
- Convex functional analysis
- Sparse kernel learning with LASSO and Bayesian inference algorithm
- A distribution-free theory of nonparametric regression
- An approximation theory approach to learning with \(\ell^1\) regularization
- Optimal regression rates for SVMs using Gaussian kernels
- Regularization networks and support vector machines
- Optimal rates for the regularized least-squares algorithm
- Some results on Tchebycheffian spline functions and stochastic processes
- Learning Theory
- An Introduction to Sparse Stochastic Processes
- Support Vector Machines
- An Explicit Description of the Reproducing Kernel Hilbert Spaces of Gaussian RBF Kernels
- Iteratively reweighted least squares minimization for sparse recovery
- Splines Are Universal Solutions of Linear Inverse Problems with Generalized TV Regularization
- Super-resolution of point sources via convex programming
- Continuous-Domain Solutions of Linear Inverse Problems With Tikhonov Versus Generalized TV Regularization
- DOI: 10.1162/1532443041827925
- Inverse problems in spaces of measures
- Pocket guide to solve inverse problems with GlobalBioIm
- The sliding Frank–Wolfe algorithm and its application to super-resolution microscopy
- B-Spline-Based Exact Discretization of Continuous-Domain Inverse Problems With Generalized TV Regularization
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Distributions and Their Hermite Expansions
- Theory of Reproducing Kernels