BriefGPT.xyz
Jun, 2021
AUGNLG: Few-shot Natural Language Generation using Self-trained Data Augmentation
Xinnuo Xu, Guoyin Wang, Young-Bum Kim, Sungjin Lee
TL;DR
This paper proposes a novel data augmentation approach named AUGNLG, which combines a self-trained neural retrieval model with a few-shot-trained neural language understanding (NLU) model to automatically create MR-to-Text data from open-domain texts, improving the efficiency of natural language generation and achieving strong results on the FewShotWOZ data.
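The augmentation loop summarized above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the retrieval and NLU components below are toy keyword-based stand-ins for the self-trained neural retrieval model and the few-shot NLU model, and all function names are assumptions.

```python
# Hedged sketch of an AUGNLG-style augmentation loop: retrieve candidate
# utterances from open-domain text, label them with a (stand-in) few-shot
# NLU model, and keep the resulting (MR, text) pairs as synthetic NLG data.

def retrieve_candidates(open_domain_text, keywords):
    """Toy stand-in for the self-trained neural retrieval model:
    keep sentences that mention any in-domain keyword."""
    return [s for s in open_domain_text
            if any(k in s.lower() for k in keywords)]

def few_shot_nlu(utterance):
    """Toy stand-in for the few-shot NLU model: map an utterance
    to a meaning representation (intent, slot-value dict)."""
    slots = {}
    if "cheap" in utterance.lower():
        slots["price"] = "cheap"
    if "pizza" in utterance.lower():
        slots["food"] = "pizza"
    return ("inform", slots) if slots else None

def augment(open_domain_text, keywords):
    """Build synthetic MR-to-Text pairs from unlabeled text."""
    pairs = []
    for utt in retrieve_candidates(open_domain_text, keywords):
        mr = few_shot_nlu(utt)
        if mr is not None:  # discard utterances the NLU cannot label
            pairs.append((mr, utt))
    return pairs

corpus = [
    "I found a cheap pizza place downtown.",
    "The weather is lovely today.",
]
print(augment(corpus, ["pizza", "cheap"]))
```

The synthetic pairs produced this way would then supplement the small in-domain training set when fine-tuning the MR-to-Text generator.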
Abstract
Natural language generation (NLG) is a key component in a task-oriented dialogue system, which converts the structured meaning representation (MR) to the natural language. For large-scale conversational systems, …