arXiv Open Access 2025

Lightweight Baselines for Medical Abstract Classification: DistilBERT with Cross-Entropy as a Strong Default

Jiaqi Liu, Tong Wang, Su Liu, Xin Hu, Ran Tong, Lanruo Wang, Jiexi Xu

Abstract

This research evaluates lightweight medical abstract classification methods to establish their maximum performance under tight compute budgets. On the public medical abstracts corpus, we fine-tune BERT-base and DistilBERT with three objectives — cross-entropy (CE), class-weighted CE, and focal loss — under identical tokenization, sequence length, optimizer, and schedule. DistilBERT with plain CE gives the strongest raw argmax trade-off, while post-hoc operating-point selection (validation-calibrated, class-wise thresholds) substantially improves deployed performance; under this tuned regime, focal loss benefits most. We report Accuracy, Macro-F1, and Weighted-F1, release evaluation artifacts, and include confusion analyses to clarify error structure. The practical takeaway is to start with a compact encoder and CE, then add lightweight calibration or thresholding when deployment requires higher macro balance.
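The class-wise threshold selection mentioned above can be sketched as follows. This is an illustrative NumPy implementation, not the paper's released code: it picks one one-vs-rest threshold per class on validation probabilities to maximize per-class F1, then shifts each class's score by its threshold before the argmax so that minority classes can win more often. The grid and the score-shift rule are assumptions for the sketch.

```python
import numpy as np

def per_class_thresholds(probs, labels, grid=np.linspace(0.05, 0.95, 19)):
    """For each class, choose the threshold that maximizes its
    one-vs-rest F1 on validation data (illustrative sketch; the
    paper's exact calibration procedure may differ).

    probs: (n_samples, n_classes) predicted probabilities
    labels: (n_samples,) integer class labels
    """
    n_classes = probs.shape[1]
    thresholds = np.zeros(n_classes)
    for c in range(n_classes):
        y_true = labels == c
        best_f1, best_t = -1.0, 0.5
        for t in grid:
            y_pred = probs[:, c] >= t
            tp = np.sum(y_pred & y_true)
            fp = np.sum(y_pred & ~y_true)
            fn = np.sum(~y_pred & y_true)
            denom = 2 * tp + fp + fn
            f1 = 2 * tp / denom if denom else 0.0
            if f1 > best_f1:
                best_f1, best_t = f1, t
        thresholds[c] = best_t
    return thresholds

def thresholded_argmax(probs, thresholds):
    """Shift each class's probability by its calibrated threshold
    before taking the argmax, biasing decisions toward classes
    whose thresholds are lower (typically minority classes)."""
    return np.argmax(probs - thresholds, axis=1)
```

At deployment time, only `thresholded_argmax` is needed; the thresholds are frozen from the validation split, so the test-time cost over plain argmax is one vector subtraction.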


Authors (7)

Jiaqi Liu, Tong Wang, Su Liu, Xin Hu, Ran Tong, Lanruo Wang, Jiexi Xu

Citation Format

Liu, J., Wang, T., Liu, S., Hu, X., Tong, R., Wang, L. et al. (2025). Lightweight Baselines for Medical Abstract Classification: DistilBERT with Cross-Entropy as a Strong Default. https://arxiv.org/abs/2510.10025

Journal Information
Year Published
2025
Language
en
Source Database
arXiv
Access
Open Access ✓