Mar 2016
Bayesian Neural Word Embedding
Oren Barkan
TL;DR
This paper introduces a scalable Bayesian neural word embedding algorithm that relies on a variational Bayes solution of the Skip-Gram objective, and gives a detailed step-by-step description of the procedure. Experimental results on six different datasets show that the algorithm performs comparably to the original Skip-Gram method on word analogy and word similarity tasks.
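The Skip-Gram negative-sampling (SGNS) objective that the paper's variational Bayesian algorithm targets can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the vocabulary size, embedding dimension, and random vectors are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D = 10, 4                              # toy vocabulary size, embedding dimension
U = rng.normal(scale=0.1, size=(V, D))    # target ("input") word vectors
C = rng.normal(scale=0.1, size=(V, D))    # context ("output") word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_loss(target, context, negatives):
    """Negative log-likelihood of one (target, context) pair with
    k negative samples: -log sigma(u.c) - sum log sigma(-u.c_neg)."""
    pos = -np.log(sigmoid(U[target] @ C[context]))
    neg = -np.sum(np.log(sigmoid(-U[target] @ C[negatives].T)))
    return pos + neg

# One observed pair (word 2 in the context of word 5) with 3 negative samples.
negatives = rng.integers(0, V, size=3)
loss = sgns_loss(2, 5, negatives)
print(loss > 0.0)
```

The Bayesian variant described in the paper replaces the point-estimate vectors `U` and `C` with distributions and optimizes a variational bound on this objective, rather than the loss directly.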
Abstract
Recently, several works in the domain of natural language processing presented successful methods for word embedding. Among them, the Skip-Gram (SG) with negative sampling, known also as Word2Vec, advanced the state-of-the-art…