Semantic Scholar · Open Access · 2023 · 70 citations

NeRF-Det: Learning Geometry-Aware Volumetric Representation for Multi-View 3D Object Detection

Chenfeng Xu, Bichen Wu, Ji Hou, Sam S. Tsai, Ruilong Li, +6 others

Abstract

We present NeRF-Det, a novel method for indoor 3D detection with posed RGB images as input. Unlike existing indoor 3D detection methods that struggle to model scene geometry, our method makes novel use of NeRF in an end-to-end manner to explicitly estimate 3D geometry, thereby improving 3D detection performance. Specifically, to avoid the significant extra latency associated with per-scene optimization of NeRF, we introduce sufficient geometry priors to enhance the generalizability of NeRF-MLP. Furthermore, we subtly connect the detection and NeRF branches through a shared MLP, enabling an efficient adaptation of NeRF to detection and yielding geometry-aware volumetric representations for 3D detection. Our method outperforms the state of the art by 3.9 mAP and 3.1 mAP on the ScanNet and ARKitScenes benchmarks, respectively. We provide extensive analysis to shed light on how NeRF-Det works. As a result of our joint-training design, NeRF-Det generalizes well to unseen scenes for object detection, view synthesis, and depth estimation tasks without requiring per-scene optimization. Code is available at https://github.com/facebookresearch/NeRF-Det.
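The core idea in the abstract — a shared MLP whose predicted opacity both drives NeRF-style rendering and gates the volumetric features used for detection — can be sketched in a few lines. The sketch below is illustrative only (function and variable names are our own, not the authors' API), using a toy voxel volume and treating a column of voxels as a "ray" in place of real camera-ray sampling:

```python
import numpy as np

# Hypothetical sketch of NeRF-Det's joint design (names are illustrative,
# not the authors' actual implementation): a shared MLP predicts a
# per-voxel opacity from aggregated multi-view features; that opacity
# both drives NeRF-style alpha compositing and weights the volumetric
# features fed to the 3D detection head.

rng = np.random.default_rng(0)

def shared_opacity_mlp(feat, w1, w2):
    """Tiny 2-layer MLP: voxel feature -> opacity in (0, 1)."""
    h = np.maximum(feat @ w1, 0.0)          # ReLU
    return 1.0 / (1.0 + np.exp(-(h @ w2)))  # sigmoid

# Toy volume: 4x4x4 voxels, each with an 8-dim feature vector.
C = 8
vol = rng.standard_normal((4, 4, 4, C))
w1 = rng.standard_normal((C, 16)) * 0.1
w2 = rng.standard_normal((16, 1)) * 0.1

opacity = shared_opacity_mlp(vol.reshape(-1, C), w1, w2).reshape(4, 4, 4, 1)

# Detection branch: geometry-aware features = opacity-weighted features,
# so free space contributes little to the detection head.
det_features = opacity * vol  # (4, 4, 4, 8)

# NeRF branch: alpha-composite along one axis (a voxel column stands in
# for a camera ray here; the real method samples along rays).
alpha = opacity[..., 0]
trans = np.cumprod(1.0 - alpha, axis=2)  # transmittance along z
weights = alpha * np.concatenate(
    [np.ones_like(trans[..., :1]), trans[..., :-1]], axis=2)

print(det_features.shape)  # (4, 4, 4, 8)
```

Because both branches read the same opacity, gradients from the rendering loss shape the geometry that the detector consumes, which is the mechanism the abstract credits for the mutual benefit.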


Authors (11)

Chenfeng Xu
Bichen Wu
Ji Hou
Sam S. Tsai
Ruilong Li
Jialiang Wang
Wei Zhan
Zijian He
Péter Vajda
K. Keutzer
M. Tomizuka

Citation Format

Xu, C., Wu, B., Hou, J., Tsai, S.S., Li, R., Wang, J. et al. (2023). NeRF-Det: Learning Geometry-Aware Volumetric Representation for Multi-View 3D Object Detection. https://doi.org/10.1109/ICCV51070.2023.02131

Publication Information
Year: 2023
Language: en
Total Citations: 70
Source Database: Semantic Scholar
DOI: 10.1109/ICCV51070.2023.02131
Access: Open Access ✓