Oct, 2020
Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers
Shusheng Xu, Xingxing Zhang, Yi Wu, Furu Wei, Ming Zhou
TL;DR
This paper proposes a method for unsupervised extractive text summarization based on Transformer self-attention, and shows on the CNN/DailyMail and New York Times datasets that it outperforms existing unsupervised models while being less dependent on sentence position.
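The core idea of attention-based extraction can be sketched as follows: a sentence is scored by how much attention it receives from the other sentences in the document, and the top-scoring sentences form the summary. This is a minimal toy sketch, not the authors' implementation; the attention matrix here is a hand-made stand-in for scores that the paper would derive from a pre-trained hierarchical Transformer, and `rank_sentences` is an illustrative name.

```python
def rank_sentences(attention, top_k=2):
    """Pick the top_k sentences that receive the most attention.

    attention[i][j] is a toy stand-in for how much sentence i
    attends to sentence j (in the paper, such scores come from a
    pre-trained hierarchical Transformer's self-attention).
    """
    n = len(attention)
    # A sentence is deemed important if other sentences attend to it heavily;
    # self-attention of a sentence to itself is excluded.
    scores = [sum(attention[i][j] for i in range(n) if i != j)
              for j in range(n)]
    ranked = sorted(range(n), key=lambda j: scores[j], reverse=True)
    return sorted(ranked[:top_k])  # restore original document order

# Toy 4-sentence document: sentence 0 receives the most attention.
attn = [
    [0.0, 0.2, 0.1, 0.1],
    [0.6, 0.0, 0.1, 0.1],
    [0.5, 0.2, 0.0, 0.2],
    [0.4, 0.3, 0.1, 0.0],
]
print(rank_sentences(attn))  # → [0, 1]
```

Returning the selected indices in document order (rather than score order) matters for extractive summaries, since the extracted sentences are read as running text.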
Abstract
Unsupervised extractive document summarization aims to select important sentences from a document without using labeled summaries during training.