
Continual Adaptation for Pacific Indigenous Speech Recognition

Yang Xiao, Aso Mahmudi, Nick Thieberger, Eliathamby Ambikairajah, Eun-Jung Holden, Ting Dang

Abstract

Speech foundation models struggle with low-resource Pacific Indigenous languages because of severe data scarcity, and full fine-tuning risks catastrophic forgetting. To address this gap, we present an empirical study adapting such models to real-world Pacific datasets, investigating how data volume and linguistic features affect adaptation success. We evaluate adaptation strategies including full fine-tuning and Low-Rank Adaptation (LoRA), and analyze a continual learning framework for sequentially acquiring multiple languages. We demonstrate that adapting to these distant languages causes severe internal representational drift, leaving the models facing a strict plasticity-stability dilemma: LoRA adapts well initially but suffers catastrophic forgetting during sequential learning. This study highlights the urgent need for robust adaptation strategies tailored to underrepresented languages.
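To illustrate the parameter-efficient strategy named in the abstract, the sketch below shows how LoRA adapters are typically attached to a speech foundation model using the Hugging Face peft library, and how forgetting on an earlier language is commonly proxied by the rise in word error rate (WER). The base checkpoint (openai/whisper-small), the rank, the target modules, and the WER values are illustrative assumptions, not the configuration or results reported in the paper.

# Minimal LoRA adaptation sketch (assumed setup, not the paper's exact configuration).
from transformers import WhisperForConditionalGeneration
from peft import LoraConfig, get_peft_model
import jiwer

# Assumption: a Whisper-style speech foundation model as the base.
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-small")

lora_cfg = LoraConfig(
    r=8,                                   # low-rank dimension (illustrative)
    lora_alpha=16,                         # LoRA scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections in Whisper blocks
)

# Wrap the frozen base model; only the small adapter matrices are trainable,
# which limits (but does not eliminate) drift in the base representations.
model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()

# Forgetting on an earlier language is commonly reported as the increase in WER
# after training on later languages (transcripts and values here are placeholders).
refs = ["example reference transcript"]
hyps_before = ["example reference transcript"]   # decoded before later-language training
hyps_after = ["example reference transcrpt"]     # decoded after later-language training
forgetting = jiwer.wer(refs, hyps_after) - jiwer.wer(refs, hyps_before)
print(f"WER increase (forgetting proxy): {forgetting:.2f}")

Full fine-tuning, the strategy the abstract contrasts with LoRA, would instead update all base-model weights rather than small adapter matrices.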


Authors (6)

Yang Xiao
Aso Mahmudi
Nick Thieberger
Eliathamby Ambikairajah
Eun-Jung Holden
Ting Dang

Citation Format

Xiao, Y., Mahmudi, A., Thieberger, N., Ambikairajah, E., Holden, E.-J., & Dang, T. (2026). Continual Adaptation for Pacific Indigenous Speech Recognition. arXiv. https://arxiv.org/abs/2603.06310

Journal Information

Publication Year: 2026
Language: en
Source Database: arXiv
Access: Open Access ✓