Balance-URSONet: A Real-Time Efficient Spacecraft Pose Estimation Network
Abstract
High-precision attitude estimation of non-cooperative space targets from monocular cameras has important application value in missions such as space debris removal, autonomous rendezvous and docking, and on-orbit servicing. However, because monocular vision systems inherently lack depth information and target geometries are highly complex, existing monocular pose estimation methods struggle to strike an effective balance between accuracy and computational efficiency. Current solutions commonly adopt deep neural network architectures to improve estimation accuracy, but this approach is often accompanied by a dramatic expansion in the number of model parameters and a significant increase in computational complexity, which limits deployment and real-time inference in real space missions. To address these problems, this paper proposes a spacecraft pose estimation network, called Balance-URSONet, that balances accuracy against parameter count and gains stronger feature extraction capability by innovatively adopting RepVGG as the feature extraction network. To further improve model performance and inference speed, this paper proposes a feature excitation unit (FEU), which flexibly adjusts the network's feature representations and thereby improves the utilization of spatial and channel information. Experimental results show that the proposed Balance-URSONet performs strongly on the spacecraft pose estimation task, achieving an ESA score of 0.13 with 13 times fewer parameters than URSONet.
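The abstract credits RepVGG for giving the backbone strong feature extraction at low inference cost. RepVGG's key idea is structural reparameterization: train with three parallel branches (a 3x3 convolution, a 1x1 convolution, and an identity shortcut), then fold all three into a single 3x3 convolution for inference, so the deployed network is a plain stride of 3x3 convs. The NumPy sketch below is my own illustration of that algebraic folding, not the authors' code; it omits the batch-norm fusion step that real RepVGG performs before merging branches, and uses a deliberately naive convolution for clarity.

```python
import numpy as np

def conv2d(x, w, pad):
    """Naive stride-1 cross-correlation: x is (Cin, H, W), w is (Cout, Cin, k, k)."""
    cin, h, wd = x.shape
    cout, _, k, _ = w.shape
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    out = np.zeros((cout, h, wd))
    for o in range(cout):
        for i in range(cin):
            for r in range(h):
                for c in range(wd):
                    out[o, r, c] += np.sum(xp[i, r:r + k, c:c + k] * w[o, i])
    return out

rng = np.random.default_rng(0)
C = 2                                   # channels (identity branch needs Cin == Cout)
x = rng.standard_normal((C, 5, 5))
w3 = rng.standard_normal((C, C, 3, 3))  # 3x3 branch weights
w1 = rng.standard_normal((C, C, 1, 1))  # 1x1 branch weights

# Training-time multi-branch output: 3x3 conv + 1x1 conv + identity shortcut.
y_branches = conv2d(x, w3, pad=1) + conv2d(x, w1, pad=0) + x

# Inference-time reparameterization: fold all three branches into one 3x3 kernel.
w1_as3 = np.zeros_like(w3)
w1_as3[:, :, 1, 1] = w1[:, :, 0, 0]     # 1x1 kernel placed at the 3x3 center
w_id = np.zeros_like(w3)
for i in range(C):
    w_id[i, i, 1, 1] = 1.0              # identity written as a center-tap 3x3 kernel
w_fused = w3 + w1_as3 + w_id

y_fused = conv2d(x, w_fused, pad=1)     # single-branch forward pass
print(np.allclose(y_branches, y_fused))  # → True
```

Because convolution is linear in its kernel, the fused single-branch pass reproduces the multi-branch output exactly, which is why the deployed model can be both accurate (multi-branch training) and fast (plain-conv inference).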
Topics & Keywords
Authors (7)
Zhiyu Bi
Ming Chen
Guopeng Ding
Haodong Yan
Shihao Han
Zhaoxiong Li
Ruixue Ma
Quick Access
- Publication Year: 2025
- Source Database: DOAJ
- DOI: 10.3390/aerospace12090840
- Access: Open Access ✓