An Energy Efficient STDP-Based SNN Architecture With On-Chip Learning


METADATA ONLY

Date

2022-12

Publication Type

Journal Article

ETH Bibliography

yes


Abstract

In this paper, we propose a spike-time-based unsupervised learning method using spike-timing-dependent plasticity (STDP). A simplified linear STDP learning rule is proposed for energy-efficient weight updates. To avoid unnecessary computations on the input spike values, an early-stop mechanism is introduced in the forward pass. In addition, a hardware-friendly input quantization scheme is used to reduce the computational complexity in both the encoding phase and the forward pass. We construct a two-layer fully connected spiking neural network (SNN) based on the proposed method. Compared with general rate-based SNNs trained by STDP, the proposed method simplifies the network architecture (no extra inhibitory layer is needed) and the computation of synaptic weight updates. In fixed-point simulation with 9-bit synaptic weights, the proposed SNN with 6144 excitatory neurons achieves 96% recognition accuracy on the MNIST dataset without any supervision. An SNN processor containing 384 excitatory neurons with on-chip learning capability is designed and implemented in 28 nm CMOS technology based on the proposed low-complexity methods. The SNN processor achieves an accuracy of 93% on the MNIST dataset. Implementation results show that the processor achieves a throughput of 277.78k FPS with 0.50 µJ/inference energy consumption in inference mode, and a throughput of 211.77k FPS with 0.66 µJ/learning energy consumption in learning mode.
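The abstract mentions a simplified linear STDP rule for low-cost weight updates but does not specify it. The following is a minimal hypothetical sketch (not the paper's exact rule) of what a linear STDP update looks like: the weight change scales linearly with the pre/post spike-time difference inside a plasticity window, replacing the usual exponential kernel, which avoids exponential arithmetic in hardware. All names and parameters here are illustrative assumptions.

```python
def linear_stdp_update(w, t_pre, t_post, lr=0.01, window=20.0,
                       w_min=0.0, w_max=1.0):
    """Return the updated synaptic weight for one pre/post spike pair.

    Hypothetical linear STDP kernel: potentiate when the presynaptic
    spike precedes the postsynaptic spike, depress otherwise, with the
    magnitude decreasing linearly with |t_post - t_pre|.
    """
    dt = t_post - t_pre
    if abs(dt) >= window:
        return w  # outside the plasticity window: no change
    if dt > 0:
        # pre before post: potentiation, largest for small dt
        w += lr * (1.0 - dt / window)
    else:
        # post before (or simultaneous with) pre: depression
        w -= lr * (1.0 + dt / window)
    # clip to the representable range (e.g., 9-bit fixed-point weights)
    return min(max(w, w_min), w_max)
```

In an actual fixed-point design, `lr` and the linear slope would be chosen as powers of two so the update reduces to shifts and adds, consistent with the paper's emphasis on energy-efficient weight updates.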

Publication status

published

Volume

69 (12)

Pages / Article No.

5147 - 5158

Publisher

IEEE

Subject

On-chip learning; spiking neural network; spike-timing-dependent plasticity; temporal coding; unsupervised learning
