June 2024
Continuum Attention for Neural Operators
Edoardo Calvello, Nikola B. Kovachki, Matthew E. Levine, Andrew M. Stuart
TL;DR
The paper uses the attention mechanism to design neural operators, formulating Transformers on function spaces. It shows that attention as implemented in practice arises as a Monte Carlo or finite-difference approximation of this continuum attention operator. It further introduces a function-space generalization of the patching strategy and an associated class of neural operators, demonstrating the promise of the function-space formulation of attention for operator learning.
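To make the Monte Carlo viewpoint concrete, the sketch below applies standard softmax attention to point samples of a function on two grid resolutions. This is a minimal NumPy illustration, not the authors' implementation; the projection matrices and the 2-channel input function are illustrative assumptions. Because the softmax divides by a sum over sample points, each output row is a ratio of Monte Carlo averages, so refining the grid leaves the output at shared points nearly unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_quadrature(u, Wq, Wk, Wv):
    """Softmax attention over samples u(x_1), ..., u(x_n) of a function.

    The softmax sum over samples is a Monte Carlo estimate of the
    normalizing integral in a continuum attention operator of the form
    (Att u)(x) = ∫ softmax(<q(x), k(y)> / sqrt(d)) v(y) dy.
    """
    q, k, v = u @ Wq, u @ Wk, u @ Wv
    scores = q @ k.T / np.sqrt(k.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)  # Monte Carlo estimate of the normalizer
    return w @ v

# Sample the same input function on a fine grid and a coarse subgrid.
x_fine = np.linspace(0.0, 1.0, 256, endpoint=False)
x_coarse = x_fine[::2]

def lift(x):
    # Illustrative smooth 2-channel input function u(x).
    return np.stack([np.sin(2 * np.pi * x), np.cos(2 * np.pi * x)], axis=1)

Wq, Wk, Wv = (rng.standard_normal((2, 2)) for _ in range(3))

out_fine = attention_quadrature(lift(x_fine), Wq, Wk, Wv)
out_coarse = attention_quadrature(lift(x_coarse), Wq, Wk, Wv)

# Outputs at shared grid points agree up to quadrature error, which
# shrinks under refinement: the discrete map is consistent with one
# operator acting on the underlying function, not on a fixed grid.
gap = np.abs(out_fine[::2] - out_coarse).max()
print(f"max gap between resolutions: {gap:.2e}")
```

Running the sketch shows the gap between resolutions is tiny for smooth inputs, which is the discretization-consistency property that motivates viewing attention as an operator on function space.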
Abstract
Transformers, and the attention mechanism in particular, have become ubiquitous in machine learning. Their success in modeling nonlocal, long-range correlations has led to their widespread adoption in natural language…