Sparse Deep Neural Networks Using L1,∞-Weight Normalization

From MaRDI portal
Publication:5155193

DOI: 10.5705/ss.202018.0468
zbMath: 1479.62083
OpenAlex: W3037054256
MaRDI QID: Q5155193

No author found.

Publication date: 6 October 2021

Published in: Statistica Sinica

Full work available at URL: https://doi.org/10.5705/ss.202018.0468

zbMATH Keywords

generalization; sparsity; Rademacher complexity; overfitting; deep neural networks
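
The paper itself is not reproduced on this record, but as a rough illustration of the constraint named in the title, the following is a minimal sketch assuming the common convention that the L1,∞ norm of a weight matrix is the largest row-wise L1 norm. The function names and the budget parameter c are hypothetical and not taken from the paper.

 import numpy as np
 
 # Minimal sketch (not the paper's implementation): the L1,inf norm of a weight
 # matrix W is taken here as the largest row-wise L1 norm. Normalizing so this
 # norm stays within a budget c encourages sparsity of each unit's incoming weights.
 
 def l1_inf_norm(W: np.ndarray) -> float:
     """Maximum L1 norm over the rows of W (assumed convention for the L1,inf norm)."""
     return float(np.abs(W).sum(axis=1).max())
 
 def project_l1_inf(W: np.ndarray, c: float = 1.0) -> np.ndarray:
     """Rescale W so that its L1,inf norm does not exceed the budget c."""
     norm = l1_inf_norm(W)
     return W if norm <= c else W * (c / norm)
 
 # Example usage on a random layer weight matrix.
 rng = np.random.default_rng(0)
 W = rng.normal(size=(4, 8))
 W_hat = project_l1_inf(W, c=1.0)
 assert l1_inf_norm(W_hat) <= 1.0 + 1e-12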


Mathematics Subject Classification ID

  • Artificial neural networks and deep learning (68T07)
  • Neural nets and related approaches to inference from stochastic processes (62M45)
  • Statistical aspects of big data and data science (62R07)


Related Items

Improve robustness and accuracy of deep neural network with \(L_{2,\infty}\) normalization


Uses Software

  • CIFAR
  • ImageNet


Cites Work

  • The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
  • Unnamed Item
  • Unnamed Item
  • Unnamed Item
  • Unnamed Item