Oct 2021
DistilHuBERT: Speech Representation Learning by Layer-wise Distillation of Hidden-unit BERT
Heng-Jui Chang, Shu-wen Yang, Hung-yi Lee
TL;DR
This paper introduces DistilHuBERT, a new multi-task learning framework that distills hidden representations from a HuBERT model, saving substantial memory and training-time costs while retaining most of the performance across ten diverse tasks, thereby making pre-training of SSL models feasible for individuals and on-device use.
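The layer-wise distillation idea can be sketched as a small student network whose shared features feed several prediction heads, each trained to match one hidden layer of a frozen teacher. The sketch below is illustrative only: the dimensions, target-layer indices, and linear heads are assumptions, and the loss is a simplified L1-plus-cosine objective, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration (not the paper's actual config)
T, D = 50, 16              # frames, feature dimension
TARGET_LAYERS = [1, 2, 3]  # teacher layers the student predicts (assumed indices)

# Frozen "teacher" hidden states, one array per target layer (stand-ins for HuBERT layers)
teacher = {l: rng.normal(size=(T, D)) for l in TARGET_LAYERS}

# Shared student features plus one small linear prediction head per target layer
student_feats = rng.normal(size=(T, D))
heads = {l: rng.normal(scale=0.1, size=(D, D)) for l in TARGET_LAYERS}

def cosine(a, b, eps=1e-8):
    # Per-frame cosine similarity between predicted and teacher features
    num = (a * b).sum(axis=-1)
    den = np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1) + eps
    return num / den

def layerwise_distill_loss(student_feats, heads, teacher):
    """Simplified multi-task distillation loss: for each target layer,
    an L1 distance plus a negative-log-sigmoid cosine term, averaged
    over layers. A sketch of the objective, not the paper's exact form."""
    losses = []
    for l, W in heads.items():
        pred = student_feats @ W  # head l predicts teacher layer l
        tgt = teacher[l]
        l1 = np.abs(pred - tgt).mean()
        cos_term = -np.log(1.0 / (1.0 + np.exp(-cosine(pred, tgt)))).mean()
        losses.append(l1 + cos_term)
    return float(np.mean(losses))

loss = layerwise_distill_loss(student_feats, heads, teacher)
print(loss)
```

Because the heads are trained jointly against several teacher layers at once, the shared student features are pushed to encode information from multiple depths of the teacher, which is the multi-task aspect the TL;DR refers to.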
Abstract
Self-supervised speech representation learning methods like wav2vec 2.0 and Hidden-unit BERT (HuBERT) leverage unlabeled speech data for pre-training…