arXiv Open Access 2025

THAT: Token-wise High-frequency Augmentation Transformer for Hyperspectral Pansharpening

Hongkun Jin, Hongcheng Jiang, Zejun Zhang, Yuan Zhang, Jia Fu, and 2 others

Abstract

Transformer-based methods have demonstrated strong potential in hyperspectral pansharpening by modeling long-range dependencies. However, their effectiveness is often limited by redundant token representations and a lack of multi-scale feature modeling. Hyperspectral images exhibit intrinsic spectral priors (e.g., abundance sparsity) and spatial priors (e.g., non-local similarity), which are critical for accurate reconstruction. From a spectral-spatial perspective, Vision Transformers (ViTs) face two major limitations: they struggle to preserve high-frequency components--such as material edges and texture transitions--and suffer from attention dispersion across redundant tokens. These issues stem from the global self-attention mechanism, which tends to dilute high-frequency signals and overlook localized details. To address these challenges, we propose the Token-wise High-frequency Augmentation Transformer (THAT), a novel framework designed to enhance hyperspectral pansharpening through improved high-frequency feature representation and token selection. Specifically, THAT introduces: (1) Pivotal Token Selective Attention (PTSA) to prioritize informative tokens and suppress redundancy; (2) a Multi-level Variance-aware Feed-forward Network (MVFN) to enhance high-frequency detail learning. Experiments on standard benchmarks show that THAT achieves state-of-the-art performance with improved reconstruction quality and efficiency. The source code is available at https://github.com/kailuo93/THAT.
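The abstract's first contribution, Pivotal Token Selective Attention (PTSA), attends over a reduced set of informative tokens rather than all tokens, countering attention dispersion across redundant ones. The minimal NumPy sketch below illustrates the general idea only: tokens are scored by a simple saliency proxy (per-token feature variance, chosen here for illustration), the top-k "pivotal" tokens are kept as keys/values, and all tokens attend to that reduced set. The function name, the scoring rule, and the top-k mechanism are assumptions for exposition, not the paper's actual PTSA design.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pivotal_token_attention(tokens, k):
    """Toy token-selective attention (illustrative, not the paper's PTSA).

    tokens : (n, d) array of token features
    k      : number of "pivotal" tokens to keep as keys/values
    """
    n, d = tokens.shape
    scores = tokens.var(axis=1)                  # saliency proxy per token
    keep = np.argsort(scores)[-k:]               # indices of the k top-scoring tokens
    kv = tokens[keep]                            # (k, d) reduced key/value set
    attn = softmax(tokens @ kv.T / np.sqrt(d))   # (n, k) attention weights
    return attn @ kv, keep                       # (n, d) outputs, kept indices
```

Restricting keys/values to k << n tokens also reduces the attention cost from O(n^2 d) to O(nkd), which is consistent with the efficiency claim in the abstract, though the real selection criterion in THAT may differ.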


Authors (7)

Hongkun Jin
Hongcheng Jiang
Zejun Zhang
Yuan Zhang
Jia Fu
Tingfeng Li
Kai Luo

Citation Format

Jin, H., Jiang, H., Zhang, Z., Zhang, Y., Fu, J., Li, T. et al. (2025). THAT: Token-wise High-frequency Augmentation Transformer for Hyperspectral Pansharpening. https://arxiv.org/abs/2508.08183

Journal Information

Publication Year: 2025
Language: en
Source Database: arXiv
Access: Open Access ✓