BriefGPT.xyz
Apr, 2023
Approximation of Nonlinear Functionals Using Deep ReLU Networks
Linhao Song, Jun Fan, Di-Rong Chen, Ding-Xuan Zhou
TL;DR
This paper studies the approximation power of functional deep neural networks with the ReLU activation function, constructing continuous piecewise linear interpolants under a simple triangulation. It further establishes approximation rates for the proposed functional deep ReLU networks under mild regularity conditions, advancing the understanding of learning algorithms for functional data.
Abstract
In recent years, functional neural networks have been proposed and studied in order to approximate nonlinear continuous functionals defined on $L^p([-1, 1]^s)$ for integers $s\ge1$ and $1\le p<\infty$. However, t
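To make the idea of a functional deep ReLU network concrete, the following is a minimal sketch, not the paper's construction: the input function is discretized by sampling it at grid points of $[-1, 1]$ (a stand-in for the interpolation step), and the sample vector is then passed through a fully connected ReLU network. The grid size, layer widths, and random weights are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def make_mlp(layer_sizes, rng):
    """Random weights for a fully connected ReLU network (illustrative only,
    not a trained or theoretically constructed network)."""
    params = []
    for n_in, n_out in zip(layer_sizes[:-1], layer_sizes[1:]):
        W = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_out, n_in))
        b = np.zeros(n_out)
        params.append((W, b))
    return params

def functional_net(f, params, m=64):
    """Evaluate a 'functional' network F(f): sample the input function f
    at m equally spaced points of [-1, 1], then feed the samples through
    the ReLU MLP to produce a scalar output."""
    grid = np.linspace(-1.0, 1.0, m)
    x = f(grid)                      # discretization of the input function
    for i, (W, b) in enumerate(params):
        x = W @ x + b
        if i < len(params) - 1:      # no activation on the output layer
            x = relu(x)
    return x.item()

params = make_mlp([64, 32, 32, 1], rng)
value = functional_net(np.sin, params)  # network's value at the input function sin
```

In this sketch the discretization maps an element of $L^p([-1, 1])$ to a finite-dimensional vector, so the functional approximation problem reduces to a standard multivariate one handled by the ReLU network.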