TL;DR: The paper proposes training neural text generation models with generative adversarial networks (GANs) to improve the quality of generated samples, and shows via comparison that an actor-critic conditional GAN outperforms maximum-likelihood training.
Abstract
Neural text generation models are often autoregressive language models or
seq2seq models. These models generate text by sampling words sequentially, with
each word conditioned on the previous word, and are state-