Apr, 2022
Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings
Danushka Bollegala
TL;DR
This paper studies the learning of meta word embeddings (meta-embeddings). It proposes two unsupervised methods that use weighted concatenation of source embeddings to learn more accurate, wider-coverage word embeddings. Experiments on multiple benchmark datasets show that these weighted-concatenation meta-embedding methods outperform previous meta-embedding learning methods.
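For orientation, the core operation the TL;DR refers to, weighted concatenation of source embeddings, can be sketched as follows. This is a minimal illustration under assumptions, not the paper's algorithm: the function name, the toy vectors, and the hand-picked weights are hypothetical, and the paper's actual contribution, learning such weights without supervision, is not implemented here.

```python
import numpy as np

def weighted_concat_meta_embedding(sources, weights):
    """Build a meta-embedding by concatenating weighted source embeddings.

    sources : list of dicts mapping word -> np.ndarray (one dict per source)
    weights : list of non-negative floats, one weight per source
    Returns a dict mapping each word found in at least one source to its
    concatenated meta-embedding vector.
    """
    # Dimensionality of each source, taken from an arbitrary vector in it.
    dims = [len(next(iter(s.values()))) for s in sources]
    vocab = set().union(*[s.keys() for s in sources])
    meta = {}
    for word in vocab:
        parts = []
        for src, w, d in zip(sources, weights, dims):
            vec = src.get(word, np.zeros(d))   # zero-pad words missing from a source
            norm = np.linalg.norm(vec)
            if norm > 0:
                vec = vec / norm               # length-normalise before weighting
            parts.append(w * vec)
        meta[word] = np.concatenate(parts)
    return meta

# Toy usage with two hypothetical 3-d sources and hand-picked weights.
src_a = {"cat": np.array([1.0, 0.0, 0.0]), "dog": np.array([0.9, 0.1, 0.0])}
src_b = {"cat": np.array([0.0, 1.0, 1.0]), "fish": np.array([0.5, 0.5, 0.0])}
meta = weighted_concat_meta_embedding([src_a, src_b], weights=[0.7, 0.3])
print(meta["cat"])  # 6-d vector: weighted src_a part followed by weighted src_b part
```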
Abstract
Given multiple source word embeddings learnt using diverse algorithms and lexical resources, meta word embedding learning methods attempt to learn more accurate and wide-coverage word embeddings. Prior work on …