Jan, 2024
Translate-Distill: Learning Cross-Language Dense Retrieval by Translation and Distillation
Eugene Yang, Dawn Lawrie, James Mayfield, Douglas W. Oard, Scott Miller
TL;DR
This work proposes Translate-Distill, a method that trains a dual-encoder CLIR student model via knowledge distillation from a cross-encoder or a CLIR cross-encoder teacher.
Abstract
Prior work on English monolingual retrieval has shown that a cross-encoder trained using a large number of relevance judgments for query-document pairs can be used as a teacher to train more efficient, but similarly effective, dual-encoder student models.
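The distillation setup described above can be sketched in a few lines: the student is trained to match the teacher's relevance distribution over candidate passages rather than binary labels. This is a minimal illustrative sketch of a KL-based distillation objective, not the paper's actual implementation; all function names are hypothetical.

```python
import math

def softmax(scores):
    """Convert raw relevance scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def distill_loss(teacher_scores, student_scores):
    """KL(teacher || student) over one query's candidate passages.

    teacher_scores: relevance scores from the (cross-encoder) teacher.
    student_scores: scores from the (dual-encoder) student being trained.
    The loss is zero when the student's distribution matches the teacher's.
    """
    p = softmax(teacher_scores)
    q = softmax(student_scores)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Example: the teacher strongly prefers the first passage; the student
# only weakly agrees, so the loss is positive and drives the student
# toward the teacher's ranking.
loss = distill_loss([5.0, 1.0, 0.5], [2.0, 1.5, 1.0])
```

In practice such a loss would be computed with automatic differentiation over batches of queries, but the objective has the same shape: only the teacher's soft score distribution, not the raw relevance judgments, supervises the student.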