dc.contributor.author: Lee, Jun H.
dc.contributor.author: Delbrück, Tobias
dc.contributor.author: Pfeiffer, Michael
dc.date.accessioned: 2019-05-29T11:40:42Z
dc.date.available: 2017-06-12T16:22:47Z
dc.date.available: 2019-05-29T11:40:42Z
dc.date.issued: 2016-11-08
dc.identifier.issn: 1662-453X
dc.identifier.issn: 1662-4548
dc.identifier.other: 10.3389/fnins.2016.00508 [en_US]
dc.identifier.uri: http://hdl.handle.net/20.500.11850/122923
dc.identifier.doi: 10.3929/ethz-b-000122923
dc.description.abstract: Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, on which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent to that of conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations. [en_US]
dc.format: application/pdf [en_US]
dc.language.iso: en [en_US]
dc.publisher: Frontiers Media
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: Spiking neural network [en_US]
dc.subject: Deep neural network [en_US]
dc.subject: Backpropagation [en_US]
dc.subject: Neuromorphic [en_US]
dc.subject: DVS [en_US]
dc.subject: MNIST [en_US]
dc.subject: N-MNIST [en_US]
dc.title: Training Deep Spiking Neural Networks Using Backpropagation [en_US]
dc.type: Journal Article
dc.rights.license: Creative Commons Attribution 4.0 International
ethz.journal.title: Frontiers in Neuroscience
ethz.journal.volume: 10 [en_US]
ethz.journal.abbreviated: Front Neurosci
ethz.pages.start: 508 [en_US]
ethz.size: 13 p. [en_US]
ethz.version.deposit: publishedVersion [en_US]
ethz.identifier.wos:
ethz.identifier.nebis: 009497874
ethz.publication.place: Lausanne
ethz.publication.status: published [en_US]
ethz.leitzahl: 03453 - Douglas, Rodney J. [en_US]
ethz.leitzahl.certified: 03453 - Douglas, Rodney J.
ethz.date.deposited: 2017-06-12T16:23:49Z
ethz.source: ECIT
ethz.identifier.importid: imp593654e3c1d4667760
ethz.ecitpid: pub:185261
ethz.eth: yes [en_US]
ethz.availability: Open access [en_US]
ethz.rosetta.installDate: 2017-07-13T03:30:11Z
ethz.rosetta.lastUpdated: 2024-02-02T08:10:19Z
ethz.rosetta.versionExported: true
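
The training trick summarized in dc.description.abstract above (treat the membrane potential as the differentiable signal and regard the discontinuity at spike times as noise, so that error backpropagation can act directly on spike signals and membrane potentials) can be illustrated with a minimal, hypothetical sketch. The code below uses PyTorch's custom autograd API purely to illustrate that general principle; it is not the paper's implementation, and SpikeFunction, its threshold argument, and the toy usage are illustrative assumptions.

import torch

class SpikeFunction(torch.autograd.Function):
    """Hypothetical sketch: hard threshold in the forward pass; the backward
    pass treats the spike discontinuity as noise and routes the gradient
    straight to the membrane potential."""

    @staticmethod
    def forward(ctx, membrane_potential, threshold=1.0):
        # Emit a spike (1.0) wherever the membrane potential reaches threshold.
        return (membrane_potential >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        # Ignore the discontinuity at spike times: pass the gradient through
        # to the membrane potential as if the output were differentiable.
        return grad_output, None

spike = SpikeFunction.apply

# Toy usage: gradients reach the membrane potentials despite the hard threshold.
v = torch.randn(4, requires_grad=True)  # stand-in membrane potentials
spike(v).sum().backward()
print(v.grad)  # non-zero gradients, so upstream weights can be trained

The paper's full method backpropagates such gradients through the membrane potentials of deep fully connected and convolutional SNNs; the sketch only shows the single-threshold mechanism that makes this possible.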