arXiv Open Access 2022

SplitNets: Designing Neural Architectures for Efficient Distributed Computing on Head-Mounted Systems

Xin Dong, Barbara De Salvo, Meng Li, Chiao Liu, Zhongnan Qu, +2 more

Abstract

We design deep neural networks (DNNs) and corresponding network splittings to distribute DNN workloads between camera sensors and a centralized aggregator on head-mounted devices, meeting system performance targets in inference accuracy and latency under given hardware resource constraints. To achieve an optimal balance among computation, communication, and performance, a split-aware neural architecture search framework, SplitNets, is introduced to conduct model design, splitting, and communication reduction simultaneously. We further extend the framework to multi-view systems, learning to fuse inputs from multiple camera sensors with optimal performance and systemic efficiency. We validate SplitNets for single-view systems on ImageNet as well as multi-view systems on 3D classification, and show that the SplitNets framework achieves state-of-the-art (SOTA) performance and system latency compared with existing approaches.


Authors (7)

Xin Dong
Barbara De Salvo
Meng Li
Chiao Liu
Zhongnan Qu
H. T. Kung
Ziyun Li

Citation Format

Dong, X., De Salvo, B., Li, M., Liu, C., Qu, Z., Kung, H. T., & Li, Z. (2022). SplitNets: Designing Neural Architectures for Efficient Distributed Computing on Head-Mounted Systems. arXiv. https://arxiv.org/abs/2204.04705

Journal Information

Publication Year
2022
Language
en
Source Database
arXiv
Access
Open Access ✓