Pages that link to "Item:Q5223573"
From MaRDI portal
The following pages link to A note on the expressive power of deep rectified linear unit networks in high‐dimensional spaces (Q5223573):
Displaying 15 items.
- Provable approximation properties for deep neural networks (Q1742817)
- Linearized two-layers neural networks in high dimension (Q2039801)
- Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem (Q2055036)
- High-dimensional distribution generation through deep neural networks (Q2062235)
- Optimal approximation rate of ReLU networks in terms of width and depth (Q2065073)
- The construction and approximation of ReLU neural network operators (Q2086452)
- Nonlinear approximation and (deep) ReLU networks (Q2117331)
- High-dimensional approximate r-nets (Q4575735)
- Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth (Q5004339)
- Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions (Q5079533)
- A note on the applications of one primary function in deep neural networks (Q5097859)
- Any Target Function Exists in a Neighborhood of Any Sufficiently Wide Random Network: A Geometrical Perspective (Q5131154)
- Deep Network Approximation for Smooth Functions (Q5155613)
- Neural network approximation: three hidden layers are enough (Q6054944)
- On mathematical modeling in image reconstruction and beyond (Q6200218)