Jun, 2019
Transcoding compositionally: using attention to find more generalizable solutions
Kris Korrel, Dieuwke Hupkes, Verna Dankers, Elia Bruni
TL;DR
This paper introduces seq2attn, a new architecture designed to exploit attention to discover compositional patterns in the input. The paper shows that seq2attn generalizes successfully on two tasks designed to challenge the compositional skills of neural networks.
Abstract
While sequence-to-sequence models have shown remarkable generalization power across several natural language tasks, their construction of solutions is argued to be less compositional than human-like …