A three layer neural network can represent any multivariate function
Publication: 2697707
DOI: 10.1016/j.jmaa.2023.127096
OpenAlex: W4319923063
MaRDI QID: Q2697707
Publication date: 13 April 2023
Published in: Journal of Mathematical Analysis and Applications
Full work available at URL: https://arxiv.org/abs/2012.03016
Mathematics Subject Classification: Artificial intelligence (68Txx); Functions of several variables (26Bxx); Approximations and expansions (41Axx)
Cites Work
- On the approximation by neural networks with bounded number of neurons in hidden layers
- A note on the representation of continuous functions by linear superpositions
- On a constructive proof of Kolmogorov's superposition theorem
- Lower bounds for approximation by MLP neural networks
- A universal mapping for Kolmogorov's superposition theorem
- Computational aspects of Kolmogorov's superposition theorem
- Multilayer feedforward networks are universal approximators
- On the uniqueness of representation by linear superpositions
- Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
- On the representation by linear superpositions
- An improvement in the superposition theorem of Kolmogorov
- Ridge Functions and Applications in Neural Networks
- Superposition, reduction of multivariable problems, and approximation
- On the Structure of Continuous Functions of Several Variables
- Neural network approximation: three hidden layers are enough
- The Kolmogorov-Arnold representation theorem revisited