Apr, 2020
Modeling Long Context for Task-Oriented Dialogue State Generation
Jun Quan, Deyi Xiong
TL;DR
Building on TRADE, a transferable dialogue state generator, this paper proposes a multi-task learning model that combines a simple yet effective utterance tagging technique with a bidirectional language model, aiming to address the sharp drop in baseline performance on long dialogues. The model achieves 52.04% joint goal accuracy on the MultiWOZ 2.0 dataset, a 7.03% relative improvement, establishing a new state of the art.
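As a rough illustration of what such an utterance tagging step might look like, the sketch below prepends a speaker marker to each turn before the turns are concatenated into a single context string for a TRADE-style encoder. The tag strings, the helper `tag_dialogue_context`, and the example dialogue are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the authors' code) of tagging utterances in a
# concatenated dialogue context so the encoder can tell turns apart.
from typing import List, Tuple

USER_TAG = "[usr]"    # assumed marker for user turns
SYSTEM_TAG = "[sys]"  # assumed marker for system turns


def tag_dialogue_context(turns: List[Tuple[str, str]]) -> str:
    """Concatenate (speaker, utterance) pairs, prefixing each utterance
    with a speaker tag before feeding the long context to the model."""
    tagged = []
    for speaker, utterance in turns:
        tag = USER_TAG if speaker == "user" else SYSTEM_TAG
        tagged.append(f"{tag} {utterance.strip()}")
    return " ".join(tagged)


if __name__ == "__main__":
    dialogue = [
        ("user", "I need a cheap hotel in the north of town."),
        ("system", "Okay, do you need parking?"),
        ("user", "Yes, and book it for two nights please."),
    ]
    print(tag_dialogue_context(dialogue))
    # [usr] I need a cheap hotel ... [sys] Okay, do you need parking? [usr] Yes, ...
```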
Abstract
Based on the recently proposed transferable dialogue state generator (TRADE) that predicts dialogue states from utterance-concatenated dialogue context, we propose a multi-task learning model with a simple yet effective utterance tagging technique and a bidirectional language model as an auxiliary task, aiming to solve the problem that the baseline's performance drops significantly when the input dialogue context is long. Our best model achieves a joint goal accuracy of 52.04% on the MultiWOZ 2.0 dataset, a 7.03% relative improvement over the baseline, establishing a new state of the art.
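As another rough sketch, assuming the state-generation loss and the auxiliary bidirectional language-model loss are combined as a simple weighted sum, the snippet below shows how such a multi-task objective could be formed; the weight `lm_weight` and all names are illustrative assumptions rather than the paper's actual training setup.

```python
# Minimal sketch of a multi-task objective: TRADE-style state-generation loss
# plus an auxiliary bidirectional LM loss, mixed with an assumed weight.
import torch


def multitask_loss(state_gen_loss: torch.Tensor,
                   bilm_loss: torch.Tensor,
                   lm_weight: float = 0.5) -> torch.Tensor:
    """Weighted sum of the main dialogue-state loss and the auxiliary
    bidirectional language-model loss."""
    return state_gen_loss + lm_weight * bilm_loss


if __name__ == "__main__":
    # Dummy scalar losses standing in for the two task heads.
    loss_state = torch.tensor(1.25)
    loss_bilm = torch.tensor(0.80)
    print(multitask_loss(loss_state, loss_bilm).item())  # 1.65
```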