GlocalDualNet: Disentangling Scale and Representation for Few-Shot Remote Sensing Segmentation
Abstract
Semantic segmentation assigns a predefined category label to every pixel in an image, distinguishing objects from one another and from the background. Few-shot semantic segmentation (FSS) extends this task to novel classes: given only a handful of annotated support samples, a model must segment pixel-level targets of unseen categories in query images without extensive labeled data. FSS in remote sensing imagery is critical yet challenging, primarily because of two intrinsic data characteristics: extreme scale variation among target objects and significant intraclass heterogeneity. These challenges severely degrade existing FSS methods, which typically rely on a single global prototype and are not explicitly designed for such variability. To address these limitations, we propose GlocalDualNet, a novel FSS framework tailored to remote sensing. GlocalDualNet integrates two core technical contributions. First, a multiscale support prototype extraction module generates a set of heterogeneous local prototypes alongside the conventional global prototype, mitigating the loss of spatial detail inherent in global-only representations and providing a more comprehensive feature signature for matching. Second, a dual-branch segmentation network explicitly disentangles feature learning for large- and small-scale targets, improving segmentation accuracy across disparate scales. Experiments on the iSAID-5<sup>i</sup> benchmark show that the proposed modules yield a 2.13% improvement in segmentation accuracy, establishing the efficacy of GlocalDualNet.
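To make the first contribution concrete, below is a minimal sketch of how a global prototype plus a set of grid-based local prototypes could be pooled from support features. The abstract does not specify the paper's actual pooling scheme, so the masked average pooling, the 2×2 grid, and all function and variable names here are illustrative assumptions, not the authors' implementation.

```python
import torch

def extract_prototypes(feat, mask, grid=2):
    """Hypothetical global + local prototype extraction via masked
    average pooling (grid scheme and names are assumptions).

    feat: support features, shape (C, H, W)
    mask: binary foreground mask, shape (H, W)
    """
    fg = mask.unsqueeze(0)  # (1, H, W), broadcasts over channels
    # Global prototype: mean of foreground features over the whole map.
    g = (feat * fg).sum(dim=(1, 2)) / fg.sum().clamp(min=1e-6)

    # Local prototypes: the same pooling applied per grid cell, retaining
    # spatial detail that a single global vector discards.
    C, H, W = feat.shape
    locals_ = []
    for i in range(grid):
        for j in range(grid):
            fs = feat[:, i*H//grid:(i+1)*H//grid, j*W//grid:(j+1)*W//grid]
            ms = fg[:, i*H//grid:(i+1)*H//grid, j*W//grid:(j+1)*W//grid]
            if ms.sum() > 0:  # keep only cells containing foreground
                locals_.append((fs * ms).sum(dim=(1, 2)) / ms.sum())
    return g, locals_
```

In a matching stage, the query features would typically be compared (e.g. by cosine similarity) against both the global prototype and each surviving local prototype, which is one way a "heterogeneous" prototype set can cover large and small targets.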
Topics & Keywords
Authors (6)
Hengren Tang
Yaxuan Jia
Jiacheng Cheng
Yang Mu
Yi Wu
Xiwen Yao
- Year Published
- 2026
- Source Database
- DOAJ
- DOI
- 10.1109/JSTARS.2026.3663646
- Access
- Open Access ✓