DOAJ Open Access 2025

A Dense Bootstrap Contrastive Learning Method With 3-D Dynamic Convolution for Few-Shot PolSAR Image Classification

Nana Jiang Wenbo Zhao Jiao Guo Xiuya Dong Jubo Zhu

Abstract

High-quality labeled samples of polarimetric synthetic aperture radar (PolSAR) images are relatively scarce. Achieving optimal classification performance with limited labeled samples has therefore become a significant challenge in PolSAR image classification. Existing deep learning methods not only rely on large numbers of labeled samples but also often face limitations in classification accuracy when handling multi-instance PolSAR image classification tasks. In response to this challenge, we propose a few-shot PolSAR image classification method based on dense bootstrap contrastive learning with 3-D dynamic convolution (DBCL-3DDC). The 3DDC design enhances the network's ability to extract features from complex data. The DBCL learns global and local representations in a 30% and 70% ratio, respectively, heuristically extracting feature representations for multi-instance PolSAR images. More importantly, during the pretraining phase, we design a multilevel contrastive learning strategy that fully utilizes both global and local instance representations without requiring labeled samples. The effectiveness of the proposed method is validated through experiments on three different datasets. Notably, on the Flevoland 1989 dataset, DBCL-3DDC achieves an overall accuracy of 97.29% using only 0.2% of labeled samples.
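The multilevel objective described above can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the authors' implementation: it assumes an InfoNCE-style contrastive term for each level and takes the 30%/70% global/local weighting from the abstract; the function names (`info_nce`, `multilevel_loss`) and the temperature value are hypothetical.

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss (assumed form): each anchor should
    match its own positive against all other positives in the batch."""
    # L2-normalize embeddings so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # matching pairs lie on the diagonal
    return -np.mean(np.diag(log_prob))

def multilevel_loss(g1, g2, l1, l2, w_global=0.3, w_local=0.7):
    """Hypothetical multilevel objective: weighted sum of a global
    (image-level) and a local (instance-level) contrastive term,
    using the 30%/70% ratio stated in the abstract."""
    return w_global * info_nce(g1, g2) + w_local * info_nce(l1, l2)
```

In this sketch, `g1`/`g2` would be global embeddings of two augmented views and `l1`/`l2` the corresponding local (instance) embeddings; no labels are required, consistent with the pretraining phase described in the abstract.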

Authors (5)

Nana Jiang

Wenbo Zhao

Jiao Guo

Xiuya Dong

Jubo Zhu

Citation Format

Jiang, N., Zhao, W., Guo, J., Dong, X., Zhu, J. (2025). A Dense Bootstrap Contrastive Learning Method With 3-D Dynamic Convolution for Few-Shot PolSAR Image Classification. https://doi.org/10.1109/JSTARS.2025.3594605

Journal Information
Publication Year
2025
Source Database
DOAJ
DOI
10.1109/JSTARS.2025.3594605
Access
Open Access ✓