BriefGPT.xyz
Feb, 2025
Lifelong Sequential Knowledge Editing without Model Degradation
Akshat Gupta, Phudish Prateepamornkul, Maochuan Lu, Ahmed Alaa, Thomas Hartvigsen...
TL;DR
This work addresses the problem that large-scale sequential editing in current knowledge-editing methods causes significant model degradation. The proposed ENCORE method controls overfitting and norm growth, enabling up to 10,000 sequential edits without loss of downstream performance. ENCORE is also 61% to 64% faster than existing methods, providing an effective solution for long-term application of knowledge editing.
Abstract
Prior work in parameter-modifying knowledge editing has shown that large-scale sequential editing leads to significant model degradation. In this paper, we study the reasons behind this and scale sequential knowledge editing…