Two-layer neural network on infinite-dimensional data: global optimization guarantee in the mean-field regime
From MaRDI portal
Publication: 6611443
DOI: 10.1088/1742-5468/ad01b2
MaRDI QID: Q6611443
Naoki Nishikawa, Atsushi Nitanda, Denny Wu, Taiji Suzuki
Publication date: 26 September 2024
Published in: Journal of Statistical Mechanics: Theory and Experiment
Cites Work
- Primal-dual subgradient methods for convex problems
- Functional multi-layer perceptron: A nonlinear tool for functional data analysis
- Optimal rates for the regularized least-squares algorithm
- Approximation of the invariant measure with an Euler scheme for stochastic PDEs driven by space-time white noise
- Duality and stability in extremum problems involving convex functions
- Weak approximation of stochastic partial differential equations: the nonlinear case
- Support Vector Machines
- Asymptotic evaluation of certain Markov process expectations for large time—III
- Nonparametric modelling for functional data: selected survey and tracks for future
- High-Dimensional Statistics
- Ergodicity for Infinite Dimensional Systems
- A mean field view of the landscape of two-layer neural networks
- Sampling can be faster than optimization
- Multilayer Perceptron with Functional Inputs: an Inverse Regression Approach
- NONPARAMETRIC REGRESSION ON FUNCTIONAL DATA: INFERENCE AND PRACTICAL ASPECTS
- Understanding Machine Learning