Semantic Scholar · Open Access · 2022 · 24 citations

A High-Accuracy and Energy-Efficient CORDIC Based Izhikevich Neuron With Error Suppression and Compensation

Jipeng Wang, Zixuan Peng, Yi Zhan, Yujie Li, Guoyi Yu, and 2 others

Abstract

Bio-inspired neuron models are the key building blocks of brain-like neural networks for brain-science exploration and neuromorphic engineering applications. Efficient hardware design of bio-inspired neuron models remains a challenge for implementing brain-like neural networks, because model accuracy, energy consumption, and hardware cost must all be balanced. This paper proposes a high-accuracy, energy-efficient Izhikevich neuron design based on a Fast-Convergence COordinate Rotation DIgital Computer (FC-CORDIC). To ensure model accuracy, an error propagation model of the Izhikevich neuron is presented for systematic error analysis and effective error reduction. A Parameter-Tuning Error Compensation (PTEC) method and a Bitwidth-Extension Error Suppression (BEES) method are proposed to effectively reduce the error of the Izhikevich neuron design. In addition, by using FC-CORDIC instead of conventional CORDIC for the square calculation in the Izhikevich model, redundant CORDIC iterations are removed; both the accumulated error and the required computation are thereby reduced, which significantly improves accuracy and energy efficiency. An optimized fixed-point FC-CORDIC design is also proposed to save hardware overhead while maintaining accuracy. FPGA implementation results show that the proposed Izhikevich neuron design achieves high accuracy and energy efficiency with acceptable hardware overhead compared with state-of-the-art designs.
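The paper itself is not reproduced here, but the Izhikevich model and the CORDIC-based square calculation it refers to are standard. The sketch below is a minimal floating-point illustration only: the paper's actual design is a fixed-point FC-CORDIC hardware architecture, and the parameter values below are the standard regular-spiking defaults from the Izhikevich model, not values taken from this paper. The square term `v**2` is computed with a linear rotation-mode CORDIC multiply, the kind of iterative shift-add scheme the paper optimizes.

```python
def cordic_multiply(x, z, iterations=24):
    """Linear rotation-mode CORDIC: drives z toward 0 while
    accumulating y, so that y converges to x * z.
    Convergence requires |z| < 2 (sum of 2^-i for i >= 0)."""
    y = 0.0
    for i in range(iterations):
        d = 1.0 if z >= 0.0 else -1.0
        y += d * x * 2.0 ** -i   # shift-add: x scaled by 2^-i
        z -= d * 2.0 ** -i       # residual rotation angle
    return y

def izhikevich_step(v, u, I, dt=0.25, a=0.02, b=0.2, c=-65.0, d=8.0):
    """One forward-Euler step of the Izhikevich model:
        v' = 0.04 v^2 + 5 v + 140 - u + I
        u' = a (b v - u)
        if v >= 30 mV: v <- c, u <- u + d
    v is pre-scaled by 1/64 so |z| < 2 holds for the CORDIC multiply;
    the result is rescaled by 64^2 = 4096."""
    vs = v / 64.0
    v2 = cordic_multiply(vs, vs) * 4096.0  # v**2 via CORDIC
    dv = 0.04 * v2 + 5.0 * v + 140.0 - u + I
    du = a * (b * v - u)
    v += dt * dv
    u += dt * du
    if v >= 30.0:          # spike: reset membrane and recovery variables
        v, u = c, u + d
    return v, u
```

With a constant input current of 10, this produces the regular-spiking pattern of the Izhikevich model; the accuracy of the spike train is exactly what the paper's error analysis targets, since each CORDIC error feeds back into `v` on every step.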

Authors (7)

Jipeng Wang

Zixuan Peng

Yi Zhan

Yujie Li

Guoyi Yu

Kwen-Siong Chong

Chao Wang

Citation Format

Wang, J., Peng, Z., Zhan, Y., Li, Y., Yu, G., Chong, K.-S., et al. (2022). A High-Accuracy and Energy-Efficient CORDIC Based Izhikevich Neuron With Error Suppression and Compensation. IEEE Transactions on Biomedical Circuits and Systems. https://doi.org/10.1109/TBCAS.2022.3191004

Quick Access

PDF not directly available

Check the original source →
View at source: doi.org/10.1109/TBCAS.2022.3191004
Journal Information
Publication Year
2022
Language
en
Total Citations
24×
Source Database
Semantic Scholar
DOI
10.1109/TBCAS.2022.3191004
Access
Open Access ✓