BriefGPT.xyz
Oct, 2016
Error bounds for approximations with deep ReLU networks
Dmitry Yarotsky
TL;DR
Studying the approximation of one-dimensional Lipschitz functions shows that deep ReLU networks approximate smooth functions more efficiently than shallow networks, and that an adaptive depth-6 network architecture is more effective than the standard shallow architecture.
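The efficiency gain from depth can be illustrated with the classic ReLU construction for the function x² on [0, 1]: composing a fixed piecewise-linear "tooth" map with itself m times and combining the results yields an approximation whose error shrinks exponentially in the depth m. The sketch below is an illustration of that idea in NumPy, not code from this page; the function names are ours.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tooth(x):
    # ReLU realization of the tent map g: [0,1] -> [0,1],
    # g(x) = 2x on [0, 1/2] and g(x) = 2(1 - x) on [1/2, 1].
    return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

def approx_square(x, m):
    # Depth-m ReLU approximation of x^2 on [0,1]:
    #   x^2 ≈ x - sum_{s=1}^m g_s(x) / 4^s,
    # where g_s is the s-fold composition of the tooth map.
    # The uniform error is at most 2^(-2m-2).
    out = np.asarray(x, dtype=float).copy()
    g = np.asarray(x, dtype=float)
    for s in range(1, m + 1):
        g = tooth(g)          # one more layer of composition
        out -= g / 4.0 ** s
    return out

xs = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(approx_square(xs, 6) - xs ** 2))
# err is below 2^(-14): the error halves twice with each added layer
```

Each extra layer of composition doubles the number of linear pieces, which is why depth buys an exponential accuracy improvement that a fixed-depth shallow network cannot match with comparable size.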
Abstract
We study how approximation errors of neural networks with ReLU activation functions depend on the depth of the network. We establish rigorous error bounds showing that deep