arXiv Open Access 2023

Controlled Text Generation with Natural Language Instructions

Wangchunshu Zhou Yuchen Eleanor Jiang Ethan Wilcox Ryan Cotterell Mrinmaya Sachan

Abstract

Large language models generate fluent texts and can follow natural language instructions to solve a wide range of tasks without task-specific training. Nevertheless, it is notoriously difficult to control their generation to satisfy the various constraints required by different applications. In this work, we present InstructCTG, a controlled text generation framework that incorporates different constraints by conditioning on natural language descriptions and demonstrations of the constraints. In particular, we first extract the underlying constraints of natural texts through a combination of off-the-shelf NLP tools and simple heuristics. We then verbalize the constraints into natural language instructions to form weakly supervised training data. By prepending natural language descriptions of the constraints and a few demonstrations, we fine-tune a pre-trained language model to incorporate various types of constraints. Compared to existing search-based or score-based methods, InstructCTG is more flexible to different constraint types and has a much smaller impact on the generation quality and speed because it does not modify the decoding procedure. Additionally, InstructCTG allows the model to adapt to new constraints without re-training through the use of few-shot task generalization and in-context learning abilities of instruction-tuned language models.
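The abstract describes a two-step recipe: extract a constraint from a target text, verbalize it as a natural language instruction, and prepend that instruction (plus optional demonstrations) to form weakly supervised training data. A minimal sketch of that idea, using a simple keyword-inclusion constraint; the template wording, helper names, and extraction heuristic here are illustrative assumptions, not the paper's exact implementation:

```python
# Sketch of the "verbalize and prepend" recipe from the abstract.
# The instruction template and the keyword heuristic are assumptions
# for illustration, not InstructCTG's actual pipeline.

def extract_keyword_constraint(text, vocabulary):
    """Heuristic constraint extraction: keep vocabulary words found in the text."""
    return [w for w in vocabulary if w.lower() in text.lower()]

def verbalize(constraint_words):
    """Turn a lexical constraint into a natural language instruction."""
    joined = ", ".join(constraint_words)
    return f"Write a sentence that includes the following words: {joined}."

def build_training_example(target_text, vocabulary, demonstrations=()):
    """Prepend the verbalized constraint and optional demonstrations to the target."""
    instruction = verbalize(extract_keyword_constraint(target_text, vocabulary))
    demo_block = "\n".join(demonstrations)
    prompt = (demo_block + "\n" if demo_block else "") + instruction
    return {"input": prompt, "output": target_text}

example = build_training_example(
    "The chef seasoned the soup with fresh basil.",
    vocabulary=["chef", "soup", "basil"],
)
print(example["input"])
```

Because the constraint lives entirely in the prompt, decoding is unchanged, which is why the abstract claims little impact on generation quality and speed compared to search- or score-based methods.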


Authors (5)

Wangchunshu Zhou

Yuchen Eleanor Jiang

Ethan Wilcox

Ryan Cotterell

Mrinmaya Sachan

Citation Format

Zhou, W., Jiang, Y.E., Wilcox, E., Cotterell, R., Sachan, M. (2023). Controlled Text Generation with Natural Language Instructions. https://arxiv.org/abs/2304.14293

Journal Information
Publication Year
2023
Language
en
Source Database
arXiv
Access
Open Access ✓