BriefGPT.xyz
Feb, 2022
Self-attention for incomplete utterance rewriting
Yong Zhang, Zhitao Li, Jianzong Wang, Ning Cheng, Jing Xiao
TL;DR
This paper proposes a new approach to the incomplete utterance rewriting task in natural language processing: coreference and omission relations are extracted directly from the self-attention weight matrix of a transformer model, and the original text is edited accordingly to produce the complete utterance.
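The core idea above is that a self-attention weight matrix already encodes token-to-token relations that can signal coreference or omission links. A minimal sketch of that intuition, using a single-head scaled dot-product self-attention over toy embeddings (not the paper's actual model or data; the embeddings and the argmax linking rule here are illustrative assumptions):

```python
import numpy as np

def self_attention_weights(X):
    """Single-head scaled dot-product self-attention weights (no learned
    projections) over token embeddings X of shape (n_tokens, d).
    Each row of the result is a probability distribution over tokens."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                 # (n, n) pairwise similarity
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True) # softmax: rows sum to 1
    return weights

# Toy "context + incomplete utterance": 5 tokens, 4-dim embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
W = self_attention_weights(X)

# Each token's most-attended *other* token is a candidate link
# (e.g. a pronoun in the utterance pointing at its antecedent in the
# context) -- the kind of relation the paper reads off the weight matrix.
W_off = W.copy()
np.fill_diagonal(W_off, 0.0)   # ignore trivial self-attention
links = W_off.argmax(axis=1)
print(W.shape, links)
```

In the paper's setting, such extracted token-pair relations drive edit operations (copying or inserting context spans) rather than being used directly, but the weight matrix is the shared starting point.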
Abstract
Incomplete utterance rewriting (IUR) has recently become an essential task in NLP, aiming to complement the incomplete utterance with sufficient context information for comprehension. In this paper, we propose a