Semantic Scholar · Open Access · 2022 · 210 citations

A Review on Language Models as Knowledge Bases

Badr AlKhamissi Millicent Li Asli Celikyilmaz Mona T. Diab Marjan Ghazvininejad

Abstract

Recently, there has been a surge of interest in the NLP community in using pretrained Language Models (LMs) as Knowledge Bases (KBs). Researchers have shown that LMs trained on a sufficiently large (web) corpus encode a significant amount of knowledge implicitly in their parameters. The resulting LM can be probed for different kinds of knowledge and can thus act as a KB. This has a major advantage over traditional KBs in that it requires no human supervision. In this paper, we present a set of aspects that we deem an LM should have to fully act as a KB, and review the recent literature with respect to those aspects.
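The kind of probing the abstract describes is commonly done in cloze style (as popularized by the LAMA benchmark): a KB triple is rendered into a natural-language template whose masked slot a pretrained masked LM is asked to fill. A minimal sketch of that idea in plain Python, with templates and triples that are illustrative assumptions rather than the paper's actual probe set:

```python
# Minimal sketch of cloze-style knowledge probing (LAMA-style).
# A (subject, relation, object) triple from a KB is turned into a
# fill-in-the-blank prompt; a masked LM's top prediction for [MASK]
# would then be compared against the gold object. The templates and
# relations below are hypothetical examples, not from the paper.

RELATION_TEMPLATES = {
    "born-in":    "{subj} was born in [MASK].",
    "capital-of": "[MASK] is the capital of {subj}.",
    "works-for":  "{subj} works for [MASK].",
}

def triple_to_prompt(subj: str, relation: str) -> str:
    """Render a KB triple (subject, relation) into a cloze prompt."""
    return RELATION_TEMPLATES[relation].format(subj=subj)

def score_prediction(predicted: str, gold: str) -> bool:
    """The probe counts as correct if the LM's top token matches the gold object."""
    return predicted.strip().lower() == gold.strip().lower()

if __name__ == "__main__":
    prompt = triple_to_prompt("Dante", "born-in")
    print(prompt)  # Dante was born in [MASK].
    # With a real masked LM (e.g. a fill-mask pipeline), the model's
    # top prediction for [MASK] would be scored against the gold object:
    print(score_prediction("Florence", "Florence"))  # True
```

In practice the `[MASK]` slot would be filled by an actual pretrained model; the scoring here is the exact-match criterion typically used in such probes.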

Authors (5)

Badr AlKhamissi
Millicent Li
Asli Celikyilmaz
Mona T. Diab
Marjan Ghazvininejad

Citation Format

AlKhamissi, B., Li, M., Celikyilmaz, A., Diab, M. T., & Ghazvininejad, M. (2022). A Review on Language Models as Knowledge Bases. https://doi.org/10.48550/arXiv.2204.06031

Quick Access

View at Source: doi.org/10.48550/arXiv.2204.06031
Publication Information
Year Published: 2022
Language: en
Total Citations: 210
Source Database: Semantic Scholar
DOI: 10.48550/arXiv.2204.06031
Access: Open Access ✓