
Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, But Cannot Fly

Nora Kassner, Hinrich Schütze

Abstract

Building on Petroni et al. (2019), we propose two new probing tasks analyzing factual knowledge stored in Pretrained Language Models (PLMs). (1) Negation. We find that PLMs do not distinguish between negated ("Birds cannot [MASK]") and non-negated ("Birds can [MASK]") cloze questions. (2) Mispriming. Inspired by priming methods in human psychology, we add "misprimes" to cloze questions ("Talk? Birds can [MASK]"). We find that PLMs are easily distracted by misprimes. These results suggest that PLMs still have a long way to go to adequately learn human-like factual knowledge.
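
The cloze-style probing described above can be reproduced with any masked language model. Below is a minimal sketch, assuming the Hugging Face transformers library and the bert-base-cased checkpoint; it illustrates the probe format only and is not the authors' exact LAMA-based evaluation setup.

```python
# Minimal sketch of cloze-style probing with negated and misprimed variants.
# Probe strings mirror the paper's examples; the fill-mask pipeline usage is
# an illustrative assumption, not the paper's original evaluation code.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-cased")

probes = [
    "Birds can [MASK].",       # non-negated probe
    "Birds cannot [MASK].",    # negated probe
    "Talk? Birds can [MASK].", # misprimed probe
]

for probe in probes:
    predictions = fill(probe, top_k=3)
    tokens = [p["token_str"] for p in predictions]
    print(f"{probe:<28} -> {tokens}")
```

If the paper's findings hold, the negated and non-negated probes yield largely overlapping top predictions, and the misprime "Talk?" pulls the model's prediction toward "talk".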

Authors (2)

Nora Kassner

Hinrich Schütze

Citation Format

Kassner, N., & Schütze, H. (2019). Negated and Misprimed Probes for Pretrained Language Models: Birds Can Talk, But Cannot Fly. https://arxiv.org/abs/1911.03343

Journal Information
Year Published: 2019
Language: en
Source Database: arXiv
Access: Open Access ✓