TDSNN: From Deep Neural Networks to Deep Spike Neural Networks with Temporal-Coding

Authors

  • Lei Zhang, Institute of Computing Technology, Chinese Academy of Sciences
  • Shengyuan Zhou, University of Chinese Academy of Sciences
  • Tian Zhi, Institute of Computing Technology, Chinese Academy of Sciences
  • Zidong Du, Institute of Computing Technology, Chinese Academy of Sciences
  • Yunji Chen, Institute of Computing Technology, Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v33i01.33011319

Abstract

Continuous-valued deep convolutional networks (DNNs) can be converted into accurate rate-coding based spike neural networks (SNNs). However, the substantial computational and energy costs, which are caused by multiple spikes, limit their use in mobile and embedded applications. Recent works have shown that the newly emerged temporal-coding based SNNs converted from DNNs can reduce the computational load effectively. In this paper, we propose a novel method, called TDSNN, to convert DNNs to temporal-coding SNNs. Building on the characteristics of the leaky integrate-and-fire (LIF) neuron model, we put forward a new coding principle, Reverse Coding, and design a novel Ticking Neuron mechanism. According to our evaluation, the proposed method reduces total operations by 42% on average in large networks compared with DNNs, with no more than 0.5% accuracy loss. The evaluation shows that TDSNN may prove to be one of the key enablers to make the adoption of SNNs widespread.
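The abstract's central idea can be illustrated with a minimal sketch of a leaky integrate-and-fire neuron under time-to-first-spike temporal coding, where each neuron emits at most one spike and earlier spike times encode larger activations. This is a generic illustration under assumed names and parameters (the function, the Euler discretization, and all constants are assumptions), not the paper's actual Reverse Coding or Ticking Neuron mechanism:

```python
def lif_first_spike_time(weights, in_spike_times,
                         threshold=1.0, tau=20.0, t_max=100.0, dt=0.1):
    """Simulate one LIF neuron; return the time of its first output spike.

    Illustrative sketch only: parameter names, values, and the forward-Euler
    discretization are assumptions, not the TDSNN conversion scheme.
    """
    v = 0.0  # membrane potential
    for step in range(int(t_max / dt)):
        t = step * dt
        v += dt * (-v / tau)  # leak term of the LIF dynamics dv/dt = -v/tau
        # each input contributes a weighted impulse at its (single) spike time
        for w, t_in in zip(weights, in_spike_times):
            if abs(t - t_in) < dt / 2:
                v += w
        if v >= threshold:
            return t  # temporal coding: the output value IS this spike time
    return None  # membrane potential never crossed threshold: no spike
```

In this coding, a more strongly driven neuron crosses threshold and fires earlier; since every neuron fires at most once, the many spikes of rate coding collapse to a single event per neuron, which is the source of the operation savings the abstract reports.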

Published

2019-07-17

How to Cite

Zhang, L., Zhou, S., Zhi, T., Du, Z., & Chen, Y. (2019). TDSNN: From Deep Neural Networks to Deep Spike Neural Networks with Temporal-Coding. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 1319-1326. https://doi.org/10.1609/aaai.v33i01.33011319

Section

AAAI Technical Track: Cognitive Modeling