Jun, 2020
Accelerating Natural Language Understanding in Task-Oriented Dialog
Ojas Ahuja, Shrey Desai
TL;DR
This work compresses a convolutional model with structured pruning; the resulting model performs close to BERT while using fewer than 100K parameters, making it suitable for mobile devices, and it runs 63x faster than DistilBERT on CPU.
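The TL;DR names structured pruning of a convolutional model as the compression method. Below is a minimal, hypothetical sketch of filter-level structured pruning for a 1D convolution in PyTorch; the prune_conv_filters helper, layer sizes, and keep ratio are illustrative assumptions, not the authors' exact procedure.

    # Sketch of structured (filter-level) pruning: keep the conv filters with
    # the largest L1 norm and rebuild a smaller layer from them. Illustrative
    # only; sizes and the 0.25 keep ratio are assumptions.
    import torch
    import torch.nn as nn

    def prune_conv_filters(conv: nn.Conv1d, keep_ratio: float = 0.5) -> nn.Conv1d:
        """Return a new Conv1d containing only the highest-L1-norm filters."""
        with torch.no_grad():
            # Importance score per output filter: shape (out_channels,)
            importance = conv.weight.abs().sum(dim=(1, 2))
            n_keep = max(1, int(conv.out_channels * keep_ratio))
            keep_idx = torch.topk(importance, n_keep).indices.sort().values

            pruned = nn.Conv1d(conv.in_channels, n_keep, conv.kernel_size[0],
                               padding=conv.padding[0], bias=conv.bias is not None)
            pruned.weight.copy_(conv.weight[keep_idx])
            if conv.bias is not None:
                pruned.bias.copy_(conv.bias[keep_idx])
        return pruned

    conv = nn.Conv1d(in_channels=128, out_channels=256, kernel_size=3, padding=1)
    smaller = prune_conv_filters(conv, keep_ratio=0.25)  # 256 -> 64 filters
    print(smaller.weight.shape)  # torch.Size([64, 128, 3])

In practice such pruning is typically followed by fine-tuning to recover accuracy; the paper's reported results (near-BERT performance at under 100K parameters) refer to its own pruning setup, not this sketch.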
Abstract
Task-oriented dialog models typically leverage complex neural architectures and large-scale, pre-trained transformers to achieve state-of-the-art …