Jun, 2022
AMR Alignment: Paying Attention to Cross-Attention
Pere-Lluís Huguet Cabot, Abelardo Carlos Martínez Lorenzo, Roberto Navigli
TL;DR
This work investigates the cross-attention mechanism of Transformer-based parsers and its capacity to align sentence structure with semantics, proposes an effective method for supervising and guiding cross-attention weights, and validates the approach experimentally.
Abstract
With the surge of transformer models, many have investigated how attention acts on the learned representations. However, attention is still …