Sep, 2017
The Expressive Power of Neural Networks: A View from the Width
Zhou Lu, Hongming Pu, Feicheng Wang, Zhiqiang Hu, Liwei Wang
TL;DR
This paper studies how the width of a neural network affects its expressive power. It proves that width-$(n+4)$ ReLU networks are universal approximators, while there exist functions that cannot be approximated by any network of width $n$, exhibiting a phase transition. The results suggest that, for the expressive power of ReLU networks, depth is more effective than width.
Abstract
The expressive power of neural networks is important for understanding deep learning. Most existing works consider this problem from the view of the depth of a network. In this paper, we study how width affects the expressive power of neural networks.
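To make the width-$(n+4)$ setting concrete, below is a minimal sketch of a deep, narrow fully connected ReLU network on an $n$-dimensional input, where every hidden layer has exactly $n+4$ units. The random weights, depth, and helper names are illustrative assumptions; the paper's actual universal-approximation construction chooses the weights carefully and is not reproduced here.

```python
import numpy as np


def relu(x):
    # Elementwise rectified linear unit.
    return np.maximum(0.0, x)


def narrow_relu_net(x, weights, biases):
    """Forward pass of a fully connected ReLU network.

    Every hidden layer here has width n + 4, where n is the input
    dimension -- the width the paper proves sufficient for universal
    approximation. The weights below are random, purely for illustration.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    W, b = weights[-1], biases[-1]  # final layer is linear (no ReLU)
    return W @ h + b


# Hypothetical example: n = 3 inputs, hidden width n + 4 = 7, depth 5.
rng = np.random.default_rng(0)
n, depth = 3, 5
width = n + 4
dims = [n] + [width] * depth + [1]  # scalar output
weights = [rng.standard_normal((dout, din)) * 0.5
           for din, dout in zip(dims[:-1], dims[1:])]
biases = [rng.standard_normal(dout) * 0.1 for dout in dims[1:]]

y = narrow_relu_net(rng.standard_normal(n), weights, biases)
print(y.shape)  # a single scalar output, shape (1,)
```

Note that depth is the free parameter here: the theorem fixes the width at $n+4$ and approximates a target function by stacking more such layers.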