BriefGPT.xyz
Jun, 2024
On the growth of the parameters of approximating ReLU neural networks
Erion Morina, Martin Holler
TL;DR
For ReLU architectures achieving state-of-the-art approximation error, the main result of this work is that the growth of the realizing parameters is at most polynomial — a growth rate that improves on existing results in most cases, in particular for high-dimensional inputs.
Abstract
This work focuses on the analysis of fully connected feed-forward ReLU neural networks as they approximate a given, smooth function. In contrast to conventionally studied universal …
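To make the object of study concrete, the following is a minimal sketch (not from the paper) of how a one-hidden-layer fully connected ReLU network can realize an approximation of a smooth function: the weights are chosen so the network computes the piecewise-linear interpolant of the target on a uniform grid. All function names (`build_relu_interpolant`, `net`) are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def build_relu_interpolant(f, a, b, n):
    """Parameters of a width-n ReLU net realizing the piecewise-linear
    interpolant of f on the n+1 uniform grid points of [a, b]."""
    xs = np.linspace(a, b, n + 1)
    h = (b - a) / n
    slopes = np.diff(f(xs)) / h          # slope on each subinterval
    c = np.diff(slopes, prepend=0.0)     # slope changes -> output weights
    # hidden unit i computes relu(x - xs[i]); output = f(a) + hidden @ c
    return xs[:-1], c, f(a)

def net(x, knots, c, offset):
    hidden = relu(x[:, None] - knots[None, :])   # (batch, width)
    return offset + hidden @ c

f = np.sin
knots, c, offset = build_relu_interpolant(f, 0.0, np.pi, 16)
xq = np.linspace(0.0, np.pi, 200)
err = np.max(np.abs(net(xq, knots, c, offset) - f(xq)))
# for smooth f the interpolation error decays like O(h^2) as width grows
```

The construction illustrates the trade-off the paper quantifies: shrinking the approximation error requires more units (here, a finer grid), and the question is how fast the parameter magnitudes must grow along the way.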