February 2024
AttnLRP: Attention-Aware Layer-wise Relevance Propagation for Transformers
Reduan Achtibat, Sayed Mohammad Vakilzadeh Hatefi, Maximilian Dreyer, Aakriti Jain, Thomas Wiegand...
TL;DR: Extending the Layer-wise Relevance Propagation method to handle attention layers enables accurate and efficient non-black-box explanations of large language model inference.
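For context, the Layer-wise Relevance Propagation (LRP) framework the paper extends redistributes an output relevance score backwards through each layer in proportion to each input's contribution. A minimal NumPy sketch of the standard epsilon rule for a plain linear layer is shown below; the function name and parameters are illustrative and not taken from the paper's implementation:

```python
import numpy as np

def lrp_epsilon_linear(a, W, R_out, eps=1e-6):
    """LRP epsilon rule for a linear layer z = a @ W (illustrative sketch).

    Redistributes the output relevance R_out to the inputs in proportion
    to each input's contribution a[j] * W[j, k] to the pre-activation z[k].
    The eps stabilizer avoids division by near-zero pre-activations.
    """
    z = a @ W                                        # pre-activations
    denom = z + eps * np.where(z >= 0, 1.0, -1.0)    # sign-matched stabilizer
    s = R_out / denom                                # relevance per unit of z
    return a * (W @ s)                               # R_in[j] = a[j] * sum_k W[j,k] * s[k]
```

A key property of this rule is (approximate) conservation: the total relevance assigned to the inputs equals the total relevance of the outputs, up to the eps stabilizer. AttnLRP's contribution is designing analogous rules for the non-linear attention operations, which this plain linear rule does not cover.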