
GXNOR-Net: training deep neural networks with ternary weights and activations without full-precision memory under a unified discretization framework

From MaRDI portal
Publication:2179802

DOI: 10.1016/j.neunet.2018.01.010
zbMath: 1434.68504
arXiv: 1705.09283
OpenAlex: W2962939807
Wikidata: Q50535241
Scholia: Q50535241
MaRDI QID: Q2179802

Peng Jiao, Guoqi Li, Jing Pei, Lei Deng, Zhenzhi Wu

Publication date: 13 May 2020

Published in: Neural Networks

Full work available at URL: https://arxiv.org/abs/1705.09283
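The title refers to networks whose weights and activations are constrained to the ternary set {-1, 0, +1}. As a rough illustration of what such a discretization looks like, here is a minimal thresholding sketch; the threshold value and the simple sign-based rule are assumptions for illustration, not the exact discretization framework of the paper.

```python
import numpy as np

def ternarize(x, delta=0.05):
    """Map each value to {-1, 0, +1} by thresholding at +/- delta.

    Illustrative assumption: values with |x| <= delta become 0,
    the rest keep only their sign. This is a generic ternary
    quantizer, not the paper's specific GXNOR-Net rule.
    """
    return (np.sign(x) * (np.abs(x) > delta)).astype(np.int8)

w = np.array([0.3, -0.01, -0.7, 0.04])
print(ternarize(w))  # -> [ 1  0 -1  0]
```

Storing weights this way needs only two bits per value, which is the memory saving the title's "without full-precision memory" alludes to.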


zbMATH Keywords

discrete state transition; GXNOR-Net; sparse binary networks; ternary neural networks


Mathematics Subject Classification ID

Artificial neural networks and deep learning (68T07)


Related Items (2)

  • Active Subspace of Neural Networks: Structural Analysis and Universal Attacks
  • GXNOR-Net


Uses Software

  • GitHub
  • AxNN
  • BinaryConnect
  • SpiNNaker


Cites Work

  • Memory Capacities for Synaptic and Structural Plasticity
  • Efficient Associative Computation with Discrete Synapses



Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:2179802&oldid=14701447"
This page was last edited on 2 February 2024, at 02:07.