 

Temporal Pattern Coding in Deep Spiking Neural Networks


METADATA ONLY

Date

2021

Publication Type

Conference Paper

ETH Bibliography

yes

Abstract

Deep Artificial Neural Networks (ANNs) employ a simplified analog neuron model that mimics the rate transfer function of integrate-and-fire neurons. In Spiking Neural Networks (SNNs), the predominant information transmission method is based on rate codes. This code is inefficient from a hardware perspective because the number of transmitted spikes is proportional to the encoded analog value. Alternate codes such as temporal codes that are based on single spikes are difficult to scale up for large networks due to their sensitivity to spike timing noise. Here we present a study of an encoding scheme based on temporal spike patterns. This scheme inherits the efficiency of temporal codes but retains the robustness of rate codes. The pattern code is evaluated on MNIST, CIFAR-10, and ImageNet image classification tasks. We compare the network performance of ANNs, rate-coded SNNs, and temporal-coded SNNs, using the classification error and operation count as performance metrics. We also estimate the power consumption of the digital logic needed for the operations associated with each encoding type, and the impact of the bit precision of the weights and activations. On ImageNet, the temporal pattern code achieves up to 35× reduction in the estimated power consumption compared to the rate-coded SNN, and 42× compared to the ANN. The classification error of the pattern-coded SNN is increased by < 1% compared to the ANN, and decreased by 2% compared to the rate-coded SNN.
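The spike-count argument in the abstract can be sketched with a toy example. Note that the function names and the simple binary pattern scheme below are illustrative assumptions, not the paper's actual encoding: a rate code needs a spike count proportional to the encoded analog value, while a pattern code over a few timesteps needs at most one spike per bit of precision.

```python
# Illustrative sketch (assumed scheme, not the paper's exact code):
# compare spike counts for a rate code vs a binary temporal pattern code.

def rate_code_spikes(value, timesteps=256):
    """Rate code: spike count is proportional to the analog value in [0, 1]."""
    return round(value * timesteps)

def pattern_code_spikes(value, bits=8):
    """Pattern code: quantize the value to `bits` levels and transmit it as a
    binary spike pattern; the spike count is the number of set bits."""
    level = round(value * (2 ** bits - 1))
    return bin(level).count("1")

v = 0.9
print(rate_code_spikes(v))     # on the order of 230 spikes for 256 timesteps
print(pattern_code_spikes(v))  # at most 8 spikes for an 8-bit pattern
```

This captures why the pattern code scales better in hardware: its spike count grows with the bit precision (logarithmic in the number of levels), not with the encoded value itself.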

Publication status

published

Book title

2021 International Joint Conference on Neural Networks (IJCNN)

Pages / Article No.

9533837

Publisher

IEEE

Event

2021 International Joint Conference on Neural Networks (IJCNN 2021)

Organisational unit

02533 - Institut für Neuroinformatik / Institute of Neuroinformatics
