Universal approximation with quadratic deep networks
From MaRDI portal
Publication: 2185719
DOI: 10.1016/j.neunet.2020.01.007
zbMath: 1434.68506
arXiv: 1808.00098
OpenAlex: W2999571325
Wikidata: Q89730519 (Scholia: Q89730519)
MaRDI QID: Q2185719
Ge Wang, Jinjun Xiong, Feng-lei Fan
Publication date: 5 June 2020
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/1808.00098
Mathematics Subject Classification: Artificial neural networks and deep learning (68T07); Algorithms for approximation of functions (65D15)
Related Items (3)
- Feasibility-based fixed point networks
- Quadratic Neural Networks for Solving Inverse Problems
- Shaping Dynamics With Multiple Populations in Low-Rank Recurrent Networks
Cites Work
- Approximation theory in tensor product spaces
- Why does deep and cheap learning work so well?
- Multilayer feedforward networks are universal approximators
- Probabilistic lower bounds for approximation by shallow perceptron networks
- Deep vs. shallow networks: An approximation theory perspective
- Universal approximation bounds for superpositions of a sigmoidal function
- Deep Convolutional Framelets: A General Deep Learning Framework for Inverse Problems
- Dependence of Computational Models on Input Dimension: Tractability of Approximation and Optimization Tasks