scientific article; zbMATH DE number 7415100
From MaRDI portal
Publication:5159432
Takuo Matsubara, Chris J. Oates, François-Xavier Briol
Publication date: 27 October 2021
Full work available at URL: https://arxiv.org/abs/2010.08488
Title: The Ridgelet Prior: A Covariance Function Approach to Prior Specification for Bayesian Neural Networks
Keywords: Gaussian processes; statistical learning theory; prior selection; Bayesian neural networks; ridgelet transform
Related Items (1)
Cites Work
- Unnamed Item
- A simple lemma on greedy approximation in Hilbert space and convergence rates for projection pursuit regression and neural network training
- Interpolation of spatial data. Some theory for kriging
- Bayesian learning for neural networks
- Neural network with unbounded activation functions is universal approximator
- An introduction to manifolds
- Introduction to empirical processes and semiparametric inference
- An introduction to infinite-dimensional analysis
- Integrability spaces for the Fourier transform of a function of bounded variation
- Stochastic processes with sample paths in reproducing kernel Hilbert spaces
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- Support Vector Machines
- Classical Fourier Analysis
- The finite ridgelet transform for image representation
- Universal approximation bounds for superpositions of a sigmoidal function
- Bounds on rates of variable-basis and neural-network approximation
- How Deep Are Deep Gaussian Processes?
- Kernel Mean Embedding of Distributions: A Review and Beyond
- High-Dimensional Statistics
- Theoretical Foundations of Functional Data Analysis, with an Introduction to Linear Operators
- Breaking the Curse of Dimensionality with Convex Neural Networks
- On Positivity of Fourier Transforms
- Wide neural networks of any depth evolve as linear models under gradient descent *
- Scattered Data Approximation
- Approximation by superpositions of a sigmoidal function