Abstract meaning representations (AMRs) are broad-coverage sentence-level
semantic representations. AMRs represent sentences as rooted labeled directed
acyclic graphs. AMR parsing is challenging partly due to the lack of annotated
alignments between nodes in the graphs and words in the sentences.
This paper proposes a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph. Through multiple rounds of attention, reasoning, and composition, the model answers two critical questions: which part of the input sequence to abstract, and where in the graph to construct the new concept. Experimental results show that the proposed model substantially improves AMR parsing accuracy over previous methods; without any large-scale pre-trained language model (such as BERT), our model already surpasses the previous state of the art.
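The interaction between the two decisions can be illustrated with a toy sketch. Everything below is an assumption for illustration only, not the paper's actual architecture: `dual_decision`, the score lists, and the `coupling` matrix are hypothetical stand-ins for the model's learned attention, reasoning, and composition components. The sketch shows the iterative idea: the sequence-side answer ("which tokens to abstract") and the graph-side answer ("where to attach the new concept") each bias the other over several rounds before committing.

```python
import math

def softmax(xs):
    # numerically stable softmax over a plain list of floats
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def dual_decision(seq_scores, graph_scores, coupling, rounds=3):
    """Toy mutual-refinement loop (hypothetical, for illustration).

    seq_scores[i]   : initial score for abstracting input token i
    graph_scores[j] : initial score for attaching at graph node j
    coupling[i][j]  : compatibility of abstracting token i while
                      attaching the new concept at node j
    Returns the argmax (token index, node index) after `rounds` of
    alternating refinement.
    """
    n, m = len(seq_scores), len(graph_scores)
    seq, graph = list(seq_scores), list(graph_scores)
    for _ in range(rounds):
        g_attn = softmax(graph)
        # "which part to abstract": bias each token toward wherever
        # the current graph attention points
        seq = [seq_scores[i] + sum(coupling[i][j] * g_attn[j] for j in range(m))
               for i in range(n)]
        s_attn = softmax(seq)
        # "where to construct": bias each node by the refined token answer
        graph = [graph_scores[j] + sum(coupling[i][j] * s_attn[i] for i in range(n))
                 for j in range(m)]
    token = max(range(n), key=lambda i: seq[i])
    node = max(range(m), key=lambda j: graph[j])
    return token, node
```

With a coupling matrix that strongly ties one token to one node, a few rounds of refinement pull both decisions toward that pair even when the initial scores are nearly flat.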