Sep, 2019
Language Models as Knowledge Bases?
Fabio Petroni, Tim Rocktäschel, Patrick Lewis, Anton Bakhtin, Yuxiang Wu...
TL;DR
An in-depth analysis of pretrained language models shows that, without fine-tuning, BERT contains relational knowledge competitive with traditional NLP methods, that this knowledge can be queried over an open class of relations, that certain types of factual knowledge are learned more readily than others by standard language-model pretraining, and that such models show potential as unsupervised open-domain QA systems.
Abstract
Recent progress in pretraining language models on large textual corpora led to a surge of improvements for downstream NLP tasks. Whilst learning linguistic knowledge, these models may also be storing relational knowledge present in the training data, and may be able to answer queries structured as "fill-in-the-blank" cloze statements.
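For illustration, a minimal sketch of such a cloze-style query against a pretrained masked language model, assuming the Hugging Face transformers library; the model name and prompt are arbitrary examples, not the paper's own probing setup:

# Query BERT with a "fill-in-the-blank" cloze statement and print its top
# predictions for the masked token, which act as answers to a factual query.
from transformers import pipeline

# Load a pretrained BERT model with its masked-language-modeling head.
fill_mask = pipeline("fill-mask", model="bert-base-cased")

# Phrase the fact to probe as a cloze statement containing a [MASK] token.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(f"{prediction['token_str']}\t{prediction['score']:.3f}")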