May 2020
Simplifying Paragraph-level Question Generation via Transformer Language Models
Luis Enrico Lopez, Diane Kathryn Cruz, Jan Christian Blaise Cruz, Charibeth Cheng
TL;DR: Question generation is a natural language generation task that can be performed by a single Transformer-based unidirectional language model. Leveraging transfer learning without relying on auxiliary data, this approach outperforms QG baselines and produces high-quality questions that are relevant to their context paragraph and easy to answer.
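To make the single-model setup concrete: a decoder-only language model can be fine-tuned on sequences that concatenate the context paragraph, the answer, and the target question, so that at inference time the model generates the question as a continuation of the context-plus-answer prompt. The sketch below is illustrative only; the separator tokens (`<paragraph>`, `<answer>`, `<question>`, `<end>`) are assumptions for demonstration, not necessarily the exact format used in the paper.

```python
# Sketch of input formatting for causal-LM question generation.
# The separator tokens below are illustrative assumptions, not the
# paper's exact scheme.

def build_qg_sequence(paragraph: str, answer: str, question: str) -> str:
    """One training string: context, answer, and question concatenated."""
    return (
        f"<paragraph> {paragraph} "
        f"<answer> {answer} "
        f"<question> {question} <end>"
    )

def build_qg_prompt(paragraph: str, answer: str) -> str:
    """Inference prompt: the model continues this with a generated question."""
    return f"<paragraph> {paragraph} <answer> {answer} <question>"

if __name__ == "__main__":
    example = build_qg_sequence(
        "The Eiffel Tower was completed in 1889 in Paris.",
        "1889",
        "When was the Eiffel Tower completed?",
    )
    print(example)
```

Because the whole task is cast as next-token prediction over one flat sequence, no separate encoder or answer-tagging module is needed, which is what allows a single unidirectional LM to handle paragraph-level QG.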