Complex node interactions are common in knowledge graphs, and these
interactions also contain rich knowledge information. However, traditional
methods usually treat a triple as a training unit during the knowledge
representation learning (KRL) procedure, neglecting contextualized information.
This paper introduces knowledge graphs (KGs) and the integration of contextual information with relational knowledge, focusing on the limitations of triple-based KGs and the advantages of contextualized KGs. It proposes KGR$^3$, a paradigm that leverages large language models (LLMs) for KG reasoning. Experiments show that KGR$^3$ significantly improves performance on KG completion and KG question answering tasks, validating the effectiveness of integrating contextual information into KG representation and reasoning.
Knowledge-Enhanced Pre-trained Language Models improve downstream NLP tasks in closed domains by injecting knowledge facts from Knowledge Graphs. The proposed KANGAROO framework captures implicit graph structure and employs data augmentation to achieve better performance.