
Locality-Sensitive Hashing-Based Efficient Point Transformer with Applications in High-Energy Physics

Siqi Miao Zhiyuan Lu Mia Liu Javier Duarte Pan Li

Abstract

This study introduces a novel transformer model optimized for large-scale point cloud processing in scientific domains such as high-energy physics (HEP) and astrophysics. Addressing the limitations of graph neural networks and standard transformers, our model integrates local inductive bias and achieves near-linear complexity with hardware-friendly regular operations. One contribution of this work is the quantitative analysis of the error-complexity tradeoff of various sparsification techniques for building efficient transformers. Our findings highlight the superiority of using locality-sensitive hashing (LSH), especially OR & AND-construction LSH, in kernel approximation for large-scale point cloud data with local inductive bias. Based on this finding, we propose LSH-based Efficient Point Transformer (HEPT), which combines E$^2$LSH with OR & AND constructions and is built upon regular computations. HEPT demonstrates remarkable performance on two critical yet time-consuming HEP tasks, significantly outperforming existing GNNs and transformers in accuracy and computational speed, marking a significant advancement in geometric deep learning and large-scale scientific data processing. Our code is available at https://github.com/Graph-COM/HEPT.
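The abstract names E$^2$LSH combined with OR & AND constructions as the core sparsification mechanism. For readers unfamiliar with that construction, below is a minimal illustrative sketch of the generic technique (Datar et al.'s E$^2$LSH hash with AND-concatenation within a table and OR across tables); it is not the authors' HEPT implementation, and the function names, parameters, and bucket-key format are hypothetical choices for illustration only. HEPT's actual code is in the linked repository.

```python
# Illustrative sketch only (not the HEPT implementation): classic E^2LSH hash
# functions combined with AND- and OR-constructions. All names and defaults
# here are hypothetical, chosen just to demonstrate the general technique.
import numpy as np

def e2lsh_hash(points, n_hashes, r, rng):
    """Map points (N, d) to integer hashes (N, n_hashes) via
    h(x) = floor((a . x + b) / r), a ~ N(0, I), b ~ Uniform(0, r)."""
    n, d = points.shape
    a = rng.normal(size=(d, n_hashes))       # random projection directions
    b = rng.uniform(0.0, r, size=n_hashes)   # random offsets
    return np.floor((points @ a + b) / r).astype(np.int64)

def and_or_buckets(points, n_tables=4, n_and=3, r=1.0, seed=0):
    """OR-construction: n_tables independent tables (points may collide in
    ANY table). AND-construction: within a table, n_and hash values are
    concatenated, so points collide only if ALL n_and hashes agree."""
    rng = np.random.default_rng(seed)
    buckets = []
    for _ in range(n_tables):                      # OR over tables
        codes = e2lsh_hash(points, n_and, r, rng)  # AND within a table
        # Concatenate the n_and integer hashes into one bucket key per point.
        buckets.append([tuple(row) for row in codes])
    return buckets  # n_tables lists of bucket keys, one key per point

# Example: nearby points tend to share a bucket key in at least one table,
# which is what restricts attention to (approximately) local neighborhoods.
pts = np.random.default_rng(1).normal(size=(8, 3))
tables = and_or_buckets(pts)
```

The AND-construction sharpens precision (fewer spurious collisions) while the OR-construction recovers recall (true neighbors missed by one table can still collide in another); the abstract's claim is that tuning this tradeoff is what makes LSH-based kernel approximation effective for point clouds with local inductive bias.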


Authors (5)

Siqi Miao
Zhiyuan Lu
Mia Liu
Javier Duarte
Pan Li

Citation Format

Miao, S., Lu, Z., Liu, M., Duarte, J., Li, P. (2024). Locality-Sensitive Hashing-Based Efficient Point Transformer with Applications in High-Energy Physics. https://arxiv.org/abs/2402.12535

Journal Information
Publication Year
2024
Language
en
Source Database
arXiv
Access
Open Access ✓