Semantic Scholar · Open Access · 2018 · 2059 citations

Universal Sentence Encoder

Daniel Matthew Cer, Yinfei Yang, Sheng-yi Kong, Nan Hua, Nicole Limtiaco +8 others

Abstract

We present models for encoding sentences into embedding vectors that specifically target transfer learning to other NLP tasks. The models are efficient and result in accurate performance on diverse transfer tasks. Two variants of the encoding models allow for trade-offs between accuracy and compute resources. For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance. Comparisons are made with baselines that use word level transfer learning via pretrained word embeddings as well as baselines that do not use any transfer learning. We find that transfer learning using sentence embeddings tends to outperform word level transfer. With transfer learning via sentence embeddings, we observe surprisingly good performance with minimal amounts of supervised training data for a transfer task. We obtain encouraging results on Word Embedding Association Tests (WEAT) targeted at detecting model bias. Our pre-trained sentence encoding models are made freely available for download and on TF Hub.
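The abstract describes encoding sentences into fixed-length embedding vectors for transfer to downstream tasks; a common way to use such vectors is to compare them by cosine similarity. A minimal sketch with NumPy, using hypothetical low-dimensional embeddings purely for illustration (the released Universal Sentence Encoder models emit 512-dimensional vectors, obtained e.g. via TF Hub):

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 4-dimensional "sentence embeddings" for illustration only;
# real USE embeddings come from the pre-trained encoder, not hand-written vectors.
emb_a = [0.20, 0.60, 0.10, 0.70]   # e.g. "How old are you?"
emb_b = [0.25, 0.55, 0.15, 0.68]   # e.g. "What is your age?"
emb_c = [0.90, -0.30, 0.40, -0.10] # e.g. "The stock fell sharply."

# Semantically close sentences should score higher than unrelated ones.
print(cosine_similarity(emb_a, emb_b))  # high (close to 1.0)
print(cosine_similarity(emb_a, emb_c))  # low
```

In practice the paper's two encoder variants (Transformer- and DAN-based) trade accuracy against compute when producing these vectors; the comparison step itself is the same.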

Topics & Keywords

Authors (13)

Daniel Matthew Cer
Yinfei Yang
Sheng-yi Kong
Nan Hua
Nicole Limtiaco
Rhomni St. John
Noah Constant
Mario Guajardo-Cespedes
Steve Yuan
C. Tar
Yun-Hsuan Sung
B. Strope
R. Kurzweil

Citation Format

Cer, D.M., Yang, Y., Kong, S., Hua, N., Limtiaco, N., John, R.S. et al. (2018). Universal Sentence Encoder. https://www.semanticscholar.org/paper/a76706d350b8c483a3aff73e61b91d15b5687335

Quick Access

PDF not directly available; check the original source.
Publication Information
Year Published: 2018
Language: en
Total Citations: 2059
Source Database: Semantic Scholar
Access: Open Access ✓