BriefGPT.xyz
Feb, 2020
SBERT-WK: A Sentence Embedding Method by Dissecting BERT-based Word Models
Bin Wang, C. -C. Jay Kuo
TL;DR
This paper presents SBERT-WK, a high-quality sentence embedding method built on BERT-based word models. By performing a geometric analysis of the space spanned by word representations, it studies the layer-wise patterns of word representations in deep contextualized models, and it evaluates performance on semantic textual similarity and downstream supervised tasks. Experimental results show that SBERT-WK achieves state-of-the-art performance.
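The TL;DR above describes combining a word's representations from different layers of a deep contextualized model into a single sentence embedding. A minimal NumPy sketch of that general idea follows: each layer of each word is weighted by how far its vector deviates from the word's cross-layer mean (a rough stand-in for the paper's geometric analysis, not the exact SBERT-WK algorithm), and the resulting word vectors are average-pooled into one sentence vector.

```python
import numpy as np

def sentence_embedding(layer_reps: np.ndarray) -> np.ndarray:
    """layer_reps: (num_layers, num_words, dim) hidden states for one sentence.

    Weight each layer's vector of each word by its distance from that word's
    mean representation across layers (a simplified proxy for layer-wise
    "novelty"), then average the fused word vectors into a sentence vector.
    """
    # per-word mean across layers: (1, num_words, dim)
    mean_rep = layer_reps.mean(axis=0, keepdims=True)
    # per-layer deviation of each word from its mean: (num_layers, num_words)
    dists = np.linalg.norm(layer_reps - mean_rep, axis=2)
    # normalize into weights that sum to 1 over layers, per word
    weights = dists / dists.sum(axis=0, keepdims=True)
    # fuse layers into one vector per word: (num_words, dim)
    word_vecs = (weights[:, :, None] * layer_reps).sum(axis=0)
    # average-pool words into the sentence embedding: (dim,)
    return word_vecs.mean(axis=0)

# toy input: 13 "layers" (embedding layer + 12 BERT layers), 5 words, dim 8
rng = np.random.default_rng(0)
reps = rng.standard_normal((13, 5, 8))
emb = sentence_embedding(reps)
print(emb.shape)  # (8,)
```

In practice the per-layer hidden states would come from a BERT-style model (e.g. requesting all hidden states from the encoder); the random tensor here only stands in for those activations.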
Abstract
Sentence embedding is an important research topic in natural language processing (NLP) since it can transfer knowledge to downstream tasks. Meanwhile, a contextualized word representation, called BERT, achieves t…