BriefGPT.xyz
Mar, 2024
LLMLingua-2: Data Distillation for Efficient and Faithful Task-Agnostic Prompt Compression
Zhuoshi Pan, Qianhui Wu, Huiqiang Jiang, Menglin Xia, Xufang Luo...
TL;DR
Using a data distillation approach, we propose a task-agnostic prompt compression method that formulates compression as a token classification problem over a Transformer encoder, compressing prompts more efficiently and with lower latency.
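The idea in the TL;DR can be sketched as a per-token keep/drop decision followed by filtering. The sketch below is not the actual LLMLingua-2 model: a real system uses a trained Transformer encoder to predict each token's keep probability, whereas here a toy heuristic scorer (`toy_keep_probability`, a hypothetical stand-in) takes its place to keep the example self-contained.

```python
# Minimal sketch of token-classification-based prompt compression.
# Assumption: a toy scorer stands in for a trained encoder's per-token
# "keep" probability; only the surrounding keep/drop logic is generic.

def toy_keep_probability(token: str) -> float:
    """Stand-in for an encoder's per-token keep probability.
    Heuristic (assumption): function words score low, content words high."""
    stopwords = {"the", "a", "an", "of", "to", "and", "is", "in", "that"}
    if token.lower() in stopwords:
        return 0.1
    return min(1.0, 0.3 + 0.1 * len(token))

def compress_prompt(prompt: str, rate: float = 0.5) -> str:
    """Keep the top `rate` fraction of tokens by predicted keep
    probability, preserving the original token order."""
    tokens = prompt.split()
    k = max(1, int(len(tokens) * rate))
    by_score = sorted(range(len(tokens)),
                      key=lambda i: toy_keep_probability(tokens[i]),
                      reverse=True)
    keep = sorted(by_score[:k])  # restore original word order
    return " ".join(tokens[i] for i in keep)

if __name__ == "__main__":
    prompt = "Summarize the main contributions of the paper in a single sentence"
    print(compress_prompt(prompt, rate=0.5))
```

Because the classifier runs once over the prompt with a small bidirectional encoder, this formulation avoids the autoregressive perplexity passes of earlier compressors, which is where the latency reduction comes from.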
Abstract
This paper focuses on task-agnostic prompt compression for better generalizability and efficiency. Considering the redundancy in natural language, existing approaches compress prompts by removing tokens or lexical units according to their