An attention mechanism and fully connected layer-based dual-branch network for bioluminescent tomographic reconstruction
Abstract
Bioluminescent tomography (BLT) is a noninvasive imaging technology that uses optical methods to study physiological and pathological processes at the cellular and molecular levels. It is a powerful tool for the early diagnosis and treatment of tumors, as well as for drug development. However, simplified optical transmission models and the ill-posed inverse reconstruction problem limit its wide application. The development of deep learning has opened new possibilities for extending the applications of BLT. Researchers have introduced methods such as neural networks and self-attention mechanisms to improve reconstruction accuracy. Despite these efforts, weak energy points around the reconstructed light-source center still impair restoration accuracy. In this study, we propose a dual-branch network that combines an attention mechanism with fully connected layers (FC-AM) to reduce centroid error and improve reconstruction performance. The network architecture consists of a fully connected (FC) subnetwork and an attention mechanism-based dual-branch (AMDB) subnetwork. The FC subnetwork processes the input data, while the AMDB subnetwork performs deep feature extraction, capturing feature information from different perspectives in parallel. Each branch of the AMDB subnetwork comprises four AM subnets, which extract features through multilayer linear transformations and attention mechanisms. The branch outputs are combined through feature fusion to produce the final result. Numerical simulations and experimental results demonstrate that the FC-AM network significantly improves BLT reconstruction performance compared with existing methods (the KNN_LC and AMLC networks), offering enhanced stability and accuracy.
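The abstract outlines the FC-AM data flow: an FC subnetwork preprocesses the input, two parallel AMDB branches (each a chain of four attention-based AM subnets) extract features, and the branch outputs are fused into the final result. A minimal NumPy sketch of such a forward pass is given below; all layer sizes, the scaled dot-product form of the attention, and the fusion-by-summation choice are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, w, b):
    return x @ w + b

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    # Scaled dot-product self-attention over the feature rows (assumed form).
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return scores @ v

def am_subnet(x, p):
    # One AM subnet: a linear transformation followed by self-attention.
    h = np.tanh(linear(x, p["w"], p["b"]))
    return self_attention(h, p["wq"], p["wk"], p["wv"])

def make_params(dim):
    # Hypothetical square weight matrices; the paper's sizes are unknown.
    names = ["w", "wq", "wk", "wv"]
    p = {n: rng.standard_normal((dim, dim)) * 0.1 for n in names}
    p["b"] = np.zeros(dim)
    return p

def fc_am_forward(x, fc_params, branch_params):
    # FC subnetwork preprocesses the input data.
    h = np.tanh(linear(x, fc_params["w"], fc_params["b"]))
    # Two parallel AMDB branches, each a chain of four AM subnets.
    branch_outputs = []
    for subnets in branch_params:
        b = h
        for p in subnets:
            b = am_subnet(b, p)
        branch_outputs.append(b)
    # Feature fusion by summation (assumed; the abstract does not specify).
    return sum(branch_outputs)

dim, tokens = 16, 8  # hypothetical feature size and token count
x = rng.standard_normal((tokens, dim))
fc_params = make_params(dim)
branches = [[make_params(dim) for _ in range(4)] for _ in range(2)]
out = fc_am_forward(x, fc_params, branches)
print(out.shape)  # fused feature map, same shape as each branch output
```

In practice a final regression head would map the fused features to nodal source energies on the reconstruction mesh; that head is omitted here because the abstract does not describe it.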
Topics & Keywords
Authors (5)
Lin Wang
Chenrui Pan
Minghua Zhao
Xin Cao
Xueli Chen
Quick Access
- Year of Publication: 2025
- Source Database: DOAJ
- DOI: 10.1142/S1793545825500105
- Access: Open Access ✓