Semantic Scholar · Open Access · 2016 · 1,145 citations

Dynamic Network Surgery for Efficient DNNs

Yiwen Guo Anbang Yao Yurong Chen

Abstract

Deep learning has become a ubiquitous technology for improving machine intelligence. However, most existing deep models are structurally very complex, making them difficult to deploy on mobile platforms with limited computational power. In this paper, we propose a novel network compression method called dynamic network surgery, which can remarkably reduce network complexity by performing on-the-fly connection pruning. Unlike previous methods that accomplish this task in a greedy way, we properly incorporate connection splicing into the whole process to avoid incorrect pruning, turning compression into continual network maintenance. The effectiveness of our method is demonstrated experimentally. Without any accuracy loss, our method can compress the number of parameters in LeNet-5 and AlexNet by factors of 108× and 17.7× respectively, outperforming the recent pruning method by considerable margins. Code and some models are available at this https URL.
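The abstract's key idea, pruning plus splicing, can be illustrated with a small mask-update sketch. This is not the authors' implementation; the magnitude thresholds `t_low` and `t_high` and the function name are hypothetical stand-ins for the paper's pruning criterion. The important property it captures is that pruned weights are only masked, not deleted, so a connection whose magnitude recovers during training can be spliced back in.

```python
import numpy as np

def update_mask(W, mask, t_low, t_high):
    """One illustrative dynamic-surgery mask update (a sketch, not the
    paper's exact rule). Weights with magnitude below t_low are pruned
    (mask set to 0); weights with magnitude above t_high are spliced
    back in (mask set to 1); weights in between keep their current state.
    """
    mag = np.abs(W)
    new_mask = mask.copy()
    new_mask[mag < t_low] = 0.0   # prune weak connections
    new_mask[mag > t_high] = 1.0  # splice strong ones back in
    return new_mask

# Toy example: a 1-D "layer" of weights with an all-ones mask.
W = np.array([0.05, -0.8, 0.3, -0.02, 1.2])
mask = np.ones_like(W)
mask = update_mask(W, mask, t_low=0.1, t_high=0.5)
effective_W = W * mask  # forward/backward pass would use these sparse weights
```

In training, the dense `W` would keep receiving gradient updates while the forward pass uses `W * mask`, which is what lets an incorrectly pruned weight grow back above the splicing threshold.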

Authors (3)

Yiwen Guo

Anbang Yao

Yurong Chen

Citation

Guo, Y., Yao, A., Chen, Y. (2016). Dynamic Network Surgery for Efficient DNNs. https://www.semanticscholar.org/paper/c220cdbcec6f92e4bc0f58c5fa6c1183105be1f9

Publication Info
Year: 2016
Language: English
Total citations: 1,145
Source database: Semantic Scholar
Access: Open Access ✓