Jul 2020
Understanding BERT Rankers Under Distillation
Luyu Gao, Zhuyun Dai, Jamie Callan
TL;DR
This paper studies how the search knowledge in BERT can be transferred into smaller rankers via distillation. Experiments show that, with a proper distillation procedure, up to a 9x speedup can be achieved while retaining state-of-the-art performance.
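For context, a common form of ranker distillation trains a small student to match the teacher's softened output distribution alongside the hard relevance labels. The sketch below is a minimal PyTorch illustration of that standard loss; the function name, temperature, and alpha mixing weight are illustrative assumptions, not the paper's exact procedure.

```python
# Minimal sketch of soft-label knowledge distillation for a ranker,
# assuming teacher and student both emit per-candidate relevance logits.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend a soft KL term (match the teacher's softened distribution)
    with a hard cross-entropy term on the gold relevance labels."""
    # Soften both distributions with the temperature before comparing.
    soft_teacher = F.log_softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence between the softened outputs; the T^2 factor keeps
    # gradient magnitudes comparable across temperature settings.
    kd = F.kl_div(soft_student, soft_teacher, log_target=True,
                  reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce

# Toy usage: batch of 8 query-document pairs, 2 classes
# (relevant / not relevant). The teacher (e.g. a full BERT ranker)
# is frozen, so its logits carry no gradient.
student_logits = torch.randn(8, 2, requires_grad=True)
teacher_logits = torch.randn(8, 2)
labels = torch.randint(0, 2, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```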
Abstract
Deep language models such as BERT, pre-trained on large corpora, have given a huge performance boost to state-of-the-art information retrieval …