Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (Q2068413)
scientific article; zbMATH DE number 7459771
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation | scientific article; zbMATH DE number 7459771 | |
Statements
Hidden unit specialization in layered neural networks: ReLU vs. sigmoidal activation (English)
19 January 2022
neural networks
machine learning
statistical physics
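For orientation, the title contrasts the two hidden-unit activation functions ReLU, g(x) = max(0, x), and a sigmoidal g(x). The sketch below is a minimal illustration of this comparison in a two-layer network, not the article's own model; the network shape, weights, and the choice of logistic sigmoid (statistical-physics analyses often use erf(x/√2) instead) are assumptions made here for illustration.

```python
import numpy as np

def relu(x):
    # Rectified linear unit: zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid; one common sigmoidal activation (an assumption here,
    # not necessarily the one used in the catalogued article).
    return 1.0 / (1.0 + np.exp(-x))

def two_layer_output(x, W, v, g):
    # Forward pass of a layered network with one hidden layer:
    # hidden pre-activations W @ x, elementwise activation g, linear readout v.
    return v @ g(W @ x)

# Hypothetical example data: 3 hidden units, 10-dimensional input.
rng = np.random.default_rng(0)
x = rng.standard_normal(10)
W = rng.standard_normal((3, 10))
v = np.ones(3) / 3

print(two_layer_output(x, W, v, relu))
print(two_layer_output(x, W, v, sigmoid))
```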