BriefGPT.xyz
Apr, 2020
Structure-Level Knowledge Distillation For Multilingual Sequence Labeling
Xinyu Wang, Yong Jiang, Nguyen Bach, Tao Wang, Fei Huang...
TL;DR
This work proposes using knowledge distillation to narrow the performance gap between a multilingual model and monolingual models; experimental results show that the method outperforms several baseline models and exhibits stronger zero-shot generalizability.
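The core idea, distilling knowledge from monolingual teacher models into a single multilingual student, can be illustrated with a minimal sketch. Note the paper's contribution is *structure-level* distillation over whole label sequences; for brevity this sketch shows the simpler token-level variant, where the student matches each teacher's per-token label distribution via KL divergence. All function names and toy numbers here are illustrative assumptions, not the authors' implementation.

```python
import math

def kl_divergence(p, q):
    # KL(p || q) between two discrete distributions over the label set.
    # Terms with p_i == 0 contribute nothing to the sum.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distillation_loss(teacher_probs, student_probs):
    # Token-level distillation (illustrative simplification): the student
    # (multilingual model) is trained to match the teacher's (monolingual
    # model's) label distribution at every token position.
    per_token = [kl_divergence(t, s)
                 for t, s in zip(teacher_probs, student_probs)]
    return sum(per_token) / len(per_token)

# Toy example: a 3-token sentence with 2 labels (e.g. B-ENT vs O).
teacher = [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4]]
student = [[0.7, 0.3], [0.3, 0.7], [0.5, 0.5]]
loss = distillation_loss(teacher, student)
```

In training, this loss would be minimized alongside the usual supervised loss on gold labels, pulling the multilingual student toward each monolingual teacher's predictions; the loss is zero exactly when the student reproduces the teacher's distributions.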
Abstract
Multilingual sequence labeling is a task of predicting label sequences using a single unified model for multiple languages. Compared with relying on multiple monolingual models, using a …