YOLO-AFP: A More Robust Network for Aerial Object Detection
Abstract
In practical applications of aerial object detection, real-time uncrewed aerial vehicle (UAV) imagery is often affected by noise, low light, and cloud occlusion, leading to poor image quality. The performance of mainstream UAV object detection algorithms tends to degrade when applied to such imagery, as these models are typically trained and evaluated on clean datasets. To address these challenges, we propose a robust YOLO-based network, YOLO-atrous feature pyramid (AFP), which integrates an AFP module. This allows the model to generalize effectively under various corrupted conditions, despite being trained only on clean data. First, we introduce the AFP module, which employs atrous convolutions with varying dilation rates, and integrate it into the path aggregation network to expand the receptive field. This enhancement allows the model to better capture object-background relationships and reduces feature corruption caused by local pixel changes. Second, we propose a robust ResNet-spatial pyramid pooling fast (SPPF) backbone network, which retains strong feature extraction capabilities while having fewer residual connections than YOLO's Darknet. This design effectively mitigates the impact of corrupted image features on subsequent feature extraction. To validate the effectiveness of our method, we constructed two new datasets based on the DOTAv1.0 dataset, named DOTA-HC and DOTA-HCloud. Experimental results demonstrate that on the DOTA-HC dataset, YOLO-AFP achieved a mean performance under corruption (mPC) of 60.8% and a relative performance under corruption (rPC) of 80.3%, outperforming the best real-time detection model by 1.5% and 2%, respectively. On the DOTA-HCloud dataset, YOLO-AFP achieved an rPC of 88.5%, surpassing the top model by 1.1%.
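The receptive-field effect the abstract attributes to atrous convolutions can be illustrated with a minimal 1-D sketch. This is our own illustration of the general technique, not the paper's AFP implementation; the function names and the single-dimension setting are assumptions for clarity:

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Valid-mode 1-D 'atrous' convolution: the k kernel taps are spaced
    `dilation` samples apart, so one layer spans dilation*(k-1)+1 inputs."""
    k = len(w)
    span = dilation * (k - 1) + 1  # effective receptive field of this layer
    out = np.empty(len(x) - span + 1)
    for i in range(len(out)):
        out[i] = sum(w[j] * x[i + j * dilation] for j in range(k))
    return out

def effective_rf(kernel_sizes, dilations):
    """Receptive field of a stride-1 stack of dilated convolutions."""
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += d * (k - 1)
    return rf

# A 3-tap kernel at dilations 1, 2, 3 sees 3, 5, and 7 input samples,
# i.e. wider context at the same parameter count:
for d in (1, 2, 3):
    print(d, effective_rf([3], [d]))
```

Running several such branches in parallel with different dilation rates (as ASPP-style modules do) lets the network combine local detail with wider object-background context without adding parameters or reducing resolution.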
Topics & Keywords
Authors (7)
Xue Li
Ziang Wang
Xueyu Chen
Wangbin Li
Kaimin Sun
Zuomei Lai
Peipei Zhu
- Publication Year
- 2025
- Source Database
- DOAJ
- DOI
- 10.1109/JSTARS.2025.3610115
- Access
- Open Access ✓