BriefGPT.xyz
Jun, 2022
Bottleneck Low-rank Transformers for Low-resource Spoken Language Understanding
Pu Wang, Hugo Van hamme
TL;DR
This paper describes how a Transformer architecture combined with group-sparsity techniques can produce smaller SLU models with high accuracy, avoiding the need for large pretrained models with many parameters.
Abstract
End-to-end spoken language understanding (SLU) systems benefit from pretraining on large corpora, followed by fine-tuning on application-specific data. The resulting models are too large for on-edge applications.
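To illustrate why a low-rank "bottleneck" shrinks a model, consider replacing a full weight matrix W (d_out × d_in) with the product of two thin factors B (d_out × r) and A (r × d_in). The sketch below only counts parameters; the dimensions and rank are assumed for illustration and are not taken from the paper.

```python
# Parameter counts for a full linear layer vs. a rank-r bottleneck
# factorization W ≈ B @ A, where A is (r x d_in) and B is (d_out x r).

def full_params(d_in: int, d_out: int) -> int:
    # One dense weight matrix of shape (d_out, d_in), biases ignored.
    return d_in * d_out

def low_rank_params(d_in: int, d_out: int, r: int) -> int:
    # Two thin matrices: A has r * d_in entries, B has d_out * r entries.
    return r * d_in + d_out * r

d = 512  # hidden size (assumed, typical for Transformers)
r = 32   # bottleneck rank (assumed)

print(full_params(d, d))         # 262144
print(low_rank_params(d, d, r))  # 32768 -> 8x fewer parameters
```

The compression ratio is d_in·d_out / (r·(d_in + d_out)), so the smaller the rank r relative to the layer width, the greater the savings, at the cost of restricting the layer to rank-r transformations.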