EMNLP, December 2023

ReasoningLM: Enabling Structural Subgraph Reasoning over Knowledge Graphs in Pre-trained Language Models for Question Answering

TL;DR: Question Answering over Knowledge Graph (KGQA) aims to find answer entities for a natural language question in a large-scale Knowledge Graph. To better perform reasoning over the KG, recent work typically pairs a pre-trained language model (PLM) with a graph neural network (GNN) module, but the two components are only loosely integrated. This paper proposes ReasoningLM, a more capable PLM that directly supports subgraph reasoning for KGQA and outperforms state-of-the-art models.
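
To illustrate the general idea of letting a single PLM read both the question and a retrieved KG subgraph (rather than delegating graph reasoning to a separate GNN), here is a minimal, hypothetical sketch that serializes subgraph triples into the PLM input. This is not the paper's actual mechanism; the model name, example question, and triples below are illustrative assumptions.

```python
# Hypothetical sketch: feed question + serialized subgraph triples to one PLM,
# so a single encoder attends over both. Not ReasoningLM's actual adaptation.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModel.from_pretrained("roberta-base")

question = "Who directed the film that won Best Picture in 1998?"
# Hypothetical retrieved subgraph triples (head, relation, tail)
subgraph = [
    ("Titanic", "won_award", "Best Picture 1998"),
    ("Titanic", "directed_by", "James Cameron"),
]

# Flatten the triples into a text sequence the PLM can consume
subgraph_text = " ; ".join(f"{h} {r} {t}" for h, r, t in subgraph)
inputs = tokenizer(question, subgraph_text, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Candidate answer entities could then be scored from these contextualized states
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_dim)
```

The design point this sketch conveys is that question tokens and subgraph tokens share one attention space, so reasoning over graph structure can happen inside the PLM itself instead of in a loosely coupled external module.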