TL;DR: Proves that deep neural networks can efficiently approximate natural classes of multivariate polynomials, but that with only a single hidden layer the required number of neurons grows exponentially; when the number of hidden layers is increased from 1 to k, the neuron requirement grows exponentially with n^(1/k) rather than with n, suggesting that the minimum number of layers needed for practical expressibility grows only logarithmically with the number of input variables n.
Abstract
It is well-known that neural networks are universal approximators, but that
deeper networks tend in practice to be more powerful than shallower ones. We
shed light on this by proving that the total number of neurons $m$ required to
approximate natural classes of multivariate polynomials of $n$ variables grows only linearly with $n$ for deep neural networks, but grows exponentially when merely a single hidden layer is allowed. We also provide evidence that when the number of hidden layers is increased from 1 to $k$, the neuron requirement grows exponentially not with $n$ but with $n^{1/k}$, suggesting that the minimum number of layers required for practical expressibility grows only logarithmically with $n$.
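As an illustrative aside (not part of the abstract itself): the shallow-network cost of expressing a monomial can be made concrete with the classical polarization identity, which writes the product $x_1 \cdots x_n$ as a sum over $2^n$ terms, each a power of a linear combination of the inputs. Under the assumption of a hidden layer with activation $z \mapsto z^n$, each term corresponds to one hidden unit, so this construction uses exponentially many neurons, whereas a deep network can instead multiply the inputs pairwise in a binary tree using only $O(n)$ units. A minimal sketch verifying the identity numerically:

```python
import itertools
import math
import random

def product_via_shallow(xs):
    """Compute x1*...*xn via the polarization identity:
    x1...xn = (1 / (2^n n!)) * sum over signs s in {-1,+1}^n of
              (s1*...*sn) * (s1*x1 + ... + sn*xn)^n.
    Each of the 2^n summands plays the role of one hidden unit
    with activation z -> z^n applied to a linear combination."""
    n = len(xs)
    total = 0.0
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        coeff = math.prod(signs)                      # product of the signs
        z = sum(s * x for s, x in zip(signs, xs))     # linear combination
        total += coeff * z ** n                       # one "neuron" output
    return total / (2 ** n * math.factorial(n))

# Check against the direct product for a random input.
xs = [random.uniform(-1.0, 1.0) for _ in range(4)]
print(abs(product_via_shallow(xs) - math.prod(xs)) < 1e-9)
```

Note that the loop runs over $2^n$ sign patterns, which is exactly the exponential neuron count the shallow case incurs; the hypothetical `product_via_shallow` helper is for illustration only and is not from the paper.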