BriefGPT.xyz
Feb, 2021
Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
Zuowei Shen, Haizhao Yang, Shijun Zhang
TL;DR
This work studies how deep feed-forward neural networks can approximate Hölder continuous and Lipschitz continuous functions at a nearly optimal rate, characterizes the approximation power of ReLU networks in terms of width and depth, and concludes that the resulting approximation rate is optimal.
Abstract
This paper concentrates on the approximation power of deep feed-forward neural networks in terms of width and depth. It is proved by construction that ReLU networks with width $\mathcal{O}\big(\max\{d\lfloor N^{1/d}\rfloor,\, N+2\}\big)$ …
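The width expression in the abstract can be made concrete with a small numerical sketch. The helpers below are illustrative only (`network_width` and `relu_mlp_forward` are not from the paper): the first evaluates the expression inside the $\mathcal{O}(\cdot)$ for the prescribed width, and the second runs a plain fixed-width, depth-$L$ ReLU feed-forward network of that shape.

```python
import math
import numpy as np

def network_width(d, N):
    """Expression inside O(.) for the prescribed width: max{d * floor(N^(1/d)), N + 2}.
    Constant factors from the paper's construction are omitted; a tiny epsilon
    guards against floating-point error in the d-th root."""
    return max(d * math.floor(N ** (1.0 / d) + 1e-12), N + 2)

def relu_mlp_forward(x, weights, biases):
    """Forward pass of a fixed-width ReLU feed-forward network."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(W @ h + b, 0.0)  # ReLU on every hidden layer
    W, b = weights[-1], biases[-1]
    return W @ h + b                    # linear output layer

# Build a random network with d inputs, L hidden layers, and the prescribed width.
d, N, L = 2, 9, 3
width = network_width(d, N)             # max(2 * floor(sqrt(9)), 9 + 2) = 11
rng = np.random.default_rng(0)
dims = [d] + [width] * L + [1]
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(len(dims) - 1)]
biases = [rng.standard_normal(dims[i + 1]) for i in range(len(dims) - 1)]
y = relu_mlp_forward(rng.standard_normal(d), weights, biases)
```

Note that as $N$ grows with $d$ fixed, the $N+2$ term dominates, so the width scales linearly in $N$ while the depth scales with $L$.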