arXiv Open Access 2026

SPARQ: Spiking Early-Exit Neural Networks for Energy-Efficient Edge AI

Parth Patne, Mahdi Taheri, Ali Mahani, Maksim Jenihhin, Reza Mahani, Christian Herglotz

Abstract

Spiking neural networks (SNNs) offer inherent energy efficiency due to their event-driven computation model, making them promising for edge AI deployment. However, their practical adoption is limited by the computational overhead of deep architectures and the absence of input-adaptive control. This work presents SPARQ, a unified framework that integrates spiking computation, quantization-aware training, and reinforcement learning-guided early exits for efficient and adaptive inference. Evaluations across MLP, LeNet, and AlexNet architectures demonstrated that the proposed Quantized Dynamic SNNs (QDSNNs) consistently outperform conventional SNNs and QSNNs, achieving up to 5.15% higher accuracy than QSNNs, over 330× lower system energy than baseline SNNs, and over 90% fewer synaptic operations across different datasets. These results validate SPARQ as a hardware-friendly, energy-efficient solution for real-time AI at the edge.
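The paper's QDSNN pipeline is not reproduced here, but the core early-exit idea the abstract describes — attach an auxiliary classifier head after each spiking layer and stop inference as soon as one head is confident enough, saving the synaptic operations of the deeper layers — can be sketched as follows. This is a minimal illustrative assumption, not SPARQ's implementation: the rate coding, integrate-and-fire threshold, softmax confidence rule, and all function names are hypothetical stand-ins (SPARQ learns the exit policy with reinforcement learning rather than a fixed threshold).

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_layer(spikes_in, weights, v_thresh=1.0):
    # Single-timestep integrate-and-fire: weighted spike input drives the
    # membrane potential; neurons fire where it crosses the threshold.
    v = spikes_in @ weights
    return (v >= v_thresh).astype(np.float32)

def exit_confidence(spike_counts):
    # Softmax over accumulated spike counts at an exit head yields a
    # confidence score and a predicted label.
    e = np.exp(spike_counts - spike_counts.max())
    p = e / e.sum()
    return float(p.max()), int(p.argmax())

def early_exit_inference(x, layers, heads, threshold=0.6, timesteps=8):
    # Run the spiking stack layer by layer. After each layer an auxiliary
    # classifier head votes; inference stops at the first exit whose
    # confidence clears the threshold, skipping all deeper layers.
    rates = x  # input firing probabilities in [0, 1]
    for depth, (w, head) in enumerate(zip(layers, heads)):
        fired = np.zeros(w.shape[1])
        counts = np.zeros(head.shape[1])
        for _ in range(timesteps):
            spikes = (rng.random(rates.shape) < rates).astype(np.float32)  # rate coding
            out = lif_layer(spikes, w)
            fired += out
            counts += out @ head  # accumulate exit-head votes
        conf, label = exit_confidence(counts)
        if conf >= threshold:
            return label, depth  # early exit: deeper layers never run
        rates = fired / timesteps  # mean firing rate feeds the next layer
    return label, depth  # deepest (final) exit
```

Easy inputs exit at shallow depth and hard inputs propagate further, which is where the reported reduction in synaptic operations comes from; quantization-aware training would additionally constrain the weights to low-bit values.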

Topics & Keywords

Authors (6)

Parth Patne
Mahdi Taheri
Ali Mahani
Maksim Jenihhin
Reza Mahani
Christian Herglotz

Citation Format

Patne, P., Taheri, M., Mahani, A., Jenihhin, M., Mahani, R., & Herglotz, C. (2026). SPARQ: Spiking Early-Exit Neural Networks for Energy-Efficient Edge AI. arXiv. https://arxiv.org/abs/2603.14380

Quick Access

View at Source
Journal Information
Publication Year
2026
Language
en
Source Database
arXiv
Access
Open Access ✓