June 2024
D2LLM: Decomposed and Distilled Large Language Models for Semantic Search
Zihan Liao, Hang Yu, Jianguo Li, Jun Wang, Wei Zhang
TL;DR: D2LLM (Decomposed and Distilled LLMs) combines an efficient bi-encoder with Pooling by Multihead Attention (PMA) and an Interaction Emulation Module (IEM), achieving both nuanced understanding and pre-computability, and surpassing baselines across a range of semantic search tasks.
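As a rough, illustrative sketch of the pooling-by-multihead-attention idea (a learnable seed query attends over the token embeddings to produce one fixed-size sentence vector), here is a minimal NumPy version. The function name `pma_pool`, the shapes, and the head count are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pma_pool(token_embs, seed_query, num_heads=4):
    """Pool a (seq_len, d) matrix of token embeddings into one (d,) vector.

    A learnable seed query attends over the tokens; each head attends over
    its own slice of the embedding dimension (illustrative simplification).
    """
    seq_len, d = token_embs.shape
    dh = d // num_heads
    heads = []
    for h in range(num_heads):
        q = seed_query[h * dh:(h + 1) * dh]          # (dh,) per-head query
        K = token_embs[:, h * dh:(h + 1) * dh]       # (seq_len, dh) per-head keys
        attn = softmax(K @ q / np.sqrt(dh))          # (seq_len,) attention weights
        heads.append(attn @ K)                       # (dh,) weighted sum of tokens
    return np.concatenate(heads)                     # (d,) pooled sentence embedding

rng = np.random.default_rng(0)
tokens = rng.standard_normal((10, 64))   # 10 tokens, 64-dim embeddings
seed_q = rng.standard_normal(64)         # learnable seed query (random here)
vec = pma_pool(tokens, seed_q)
print(vec.shape)  # (64,)
```

Because the pooled vector depends only on the document's own tokens, it can be pre-computed offline, which is what gives the bi-encoder side its efficiency; the interaction module then models query-document interplay on top of these cached vectors.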