Deep regularization and direct training of the inner layers of neural networks with kernel flows
Publication: 2077570
DOI: 10.1016/j.physd.2021.132952
OpenAlex: W3007786121
MaRDI QID: Q2077570
Publication date: 21 February 2022
Published in: Physica D
Full work available at URL: https://arxiv.org/abs/2002.08335
Keywords: artificial neural networks; machine learning; image classification; Gaussian process regression; inner layer training; kernel flows
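As background on the "kernel flows" keyword: the cited work "Kernel flows: from learning kernels from data into the abyss" learns kernel parameters by minimizing the fraction of RKHS norm lost when the regressor is recomputed from a random half of each batch, and the present publication applies that loss to train a network's inner layers viewed as a feature map inside the kernel. Below is a minimal NumPy sketch of the kernel-flow loss on a plain RBF kernel; the kernel choice, toy data, function names, and step sizes are illustrative assumptions, not taken from this entry.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def kernel_flow_rho(X_b, y_b, half, gamma, reg=1e-8):
    """rho = 1 - (y_c' K_c^{-1} y_c) / (y_b' K_b^{-1} y_b): the relative
    RKHS-norm loss from interpolating with the half-batch `half` instead
    of the full batch. Small rho means the kernel generalizes well."""
    K_b = rbf_kernel(X_b, X_b, gamma) + reg * np.eye(len(X_b))
    X_c, y_c = X_b[half], y_b[half]
    K_c = rbf_kernel(X_c, X_c, gamma) + reg * np.eye(len(X_c))
    num = y_c @ np.linalg.solve(K_c, y_c)
    den = y_b @ np.linalg.solve(K_b, y_b)
    return 1.0 - num / den

# Toy run: tune the RBF bandwidth gamma by stochastic finite-difference
# descent on rho, re-sampling the half-batch at every step.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(64, 1))
y = np.sin(3.0 * X[:, 0])
gamma, lr, eps = 1.0, 0.5, 1e-3
for _ in range(200):
    half = rng.choice(len(X), size=len(X) // 2, replace=False)
    grad = (kernel_flow_rho(X, y, half, gamma + eps)
            - kernel_flow_rho(X, y, half, gamma - eps)) / (2.0 * eps)
    gamma = max(gamma - lr * grad, 1e-3)  # keep the bandwidth positive
print(f"learned gamma: {gamma:.3f}")
```

In the paper's setting, gamma would be replaced by the weights of the inner layers (with the kernel evaluated on their output features) and the finite-difference step by backpropagation through rho.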
Related Items (4)
- Computational graph completion
- Learning dynamical systems from data: a simple cross-validation perspective. III: Irregularly-sampled time series
- Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics
- Do ideas have shape? Idea registration as the continuous limit of artificial neural networks
Cites Work
- A comparison of generalized cross validation and modified maximum likelihood for estimating the parameters of a stochastic process
- Cross validation and maximum likelihood estimations of hyper-parameters of Gaussian processes with model misspecification
- Kernel flows: from learning kernels from data into the abyss
- Bayesian Inference for Non-Stationary Spatial Covariance Structure via Spatial Deformations
- Can a finite element method perform arbitrarily badly?
- Operator-Adapted Wavelets, Fast Solvers, and Numerical Homogenization
- Prevalence of neural collapse during the terminal phase of deep learning training
- Statistical Numerical Approximation
- Bayesian Numerical Homogenization