BriefGPT.xyz
Oct, 2023
Masked Hard-Attention Transformers and Boolean RASP Recognize Exactly the Star-Free Languages
Dana Angluin, David Chiang, Andy Yang
TL;DR
Transformer encoders with hard attention and strict future masking recognize exactly the star-free languages; adding position embeddings extends the recognized class to other well-studied classes. Via Boolean RASP (B-RASP), this connects transformers to first-order logic, temporal logic, and algebraic automata theory.
Abstract
We consider transformer encoders with hard attention (in which all attention is focused on exactly one position) and strict future masking (in which each position attends only to strictly earlier positions), and show that the class of languages recognized by these networks is exactly the star-free languages.
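The two mechanisms named in the abstract can be made concrete with a short sketch. Below, unique hard attention means each position selects exactly one other position (the argmax of its attention scores), and strict future masking means position i may only attend to positions j < i. The `default` value returned at position 0 is an assumed convention for illustration, since under strict masking position 0 has no position to attend to; it is not specified in the excerpt above.

```python
import numpy as np

def masked_hard_attention(scores, values, default):
    """Unique hard attention with strict future masking (sketch):
    position i attends to exactly one earlier position j < i,
    namely the argmax of scores[i, :i]; position 0 receives
    `default` (an assumed convention, since no earlier position
    exists under strict masking)."""
    n = scores.shape[0]
    out = np.empty((n,) + values.shape[1:])
    out[0] = default
    for i in range(1, n):
        j = int(np.argmax(scores[i, :i]))  # exactly one position wins
        out[i] = values[j]
    return out

# Each position copies the value held by its highest-scoring
# strictly-earlier position.
scores = np.array([[0., 0., 0.],
                   [1., 0., 0.],
                   [0., 2., 0.]])
values = np.array([10., 20., 30.])
print(masked_hard_attention(scores, values, default=0.0))  # → [ 0. 10. 20.]
```

Note that no attention weights are mixed: unlike soft attention, the output at each position is exactly one value vector, which is what makes the connection to Boolean (rather than real-valued) RASP programs possible.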