
Parameter Aware Mamba Model for Multi-task Dense Prediction

Xinzhuo Yu, Yunzhi Zhuge, Sitong Gong, Lu Zhang, Pingping Zhang, Huchuan Lu

Abstract

Understanding the inter-relations and interactions between tasks is crucial for multi-task dense prediction. Existing methods predominantly utilize convolutional layers and attention mechanisms to explore task-level interactions. In this work, we introduce a novel decoder-based framework, the Parameter Aware Mamba Model (PAMM), specifically designed for dense prediction in a multi-task learning setting. Distinct from approaches that employ Transformers to model holistic task relationships, PAMM leverages the rich, scalable parameters of state space models to enhance task interconnectivity. It features dual state space parameter experts that integrate and set task-specific parameter priors, capturing the intrinsic properties of each task. This approach not only facilitates precise multi-task interactions but also allows for the global integration of task priors through the structured state space sequence model (S4). Furthermore, we employ the Multi-Directional Hilbert Scanning method to construct multi-angle feature sequences, thereby enhancing the sequence model's perceptual capabilities for 2D data. Extensive experiments on the NYUD-v2 and PASCAL-Context benchmarks demonstrate the effectiveness of our proposed method. Our code is available at https://github.com/CQC-gogopro/PAMM.
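To make the Hilbert-scanning idea concrete, the sketch below shows how a 2D feature map can be serialized into 1D token sequences along a Hilbert curve before being fed to a sequence model such as S4. This is a minimal illustration, not the authors' implementation: the helper names `hilbert_d2xy` and `hilbert_scan` are hypothetical, the grid side is assumed to be a power of two, and "multi-directional" is approximated here by simply scanning a flipped copy of the same map to obtain a second traversal order.

```python
# Minimal sketch (assumed, not the PAMM code): Hilbert-curve serialization of
# a 2D feature map into 1D sequences for a state space sequence model.
import numpy as np

def hilbert_d2xy(n, d):
    """Map a 1D Hilbert-curve index d to (x, y) on an n x n grid (n a power of two)."""
    x = y = 0
    t = d
    s = 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                      # rotate the quadrant when needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def hilbert_scan(feat):
    """Reorder an (n, n, C) feature map into an (n*n, C) sequence along the Hilbert curve."""
    n = feat.shape[0]
    order = [hilbert_d2xy(n, d) for d in range(n * n)]
    return np.stack([feat[y, x] for x, y in order], axis=0)

# Toy "multi-directional" usage: scan the map and a flipped copy to obtain
# two complementary traversal orders of the same features.
feat = np.random.randn(8, 8, 32)
seq_fwd = hilbert_scan(feat)               # shape (64, 32)
seq_rev = hilbert_scan(feat[::-1, ::-1])   # flipped variant, also (64, 32)
print(seq_fwd.shape, seq_rev.shape)
```

The appeal of a Hilbert ordering over a plain raster scan is locality: tokens that are adjacent in the sequence are also spatially close in the 2D map, which is why the abstract motivates it as a way to improve the sequence model's perception of 2D structure.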


Authors (6)

Xinzhuo Yu
Yunzhi Zhuge
Sitong Gong
Lu Zhang
Pingping Zhang
Huchuan Lu

Citation Format

Yu, X., Zhuge, Y., Gong, S., Zhang, L., Zhang, P., & Lu, H. (2025). Parameter Aware Mamba Model for Multi-task Dense Prediction. arXiv. https://arxiv.org/abs/2511.14503

Journal Information
Year Published: 2025
Language: en
Source Database: arXiv
Access: Open Access ✓