Nov, 2016
Understanding Deep Neural Networks with Rectified Linear Units
Raman Arora, Amitabh Basu, Poorya Mianjy, Anirbit Mukherjee
TL;DR
This paper studies the family of functions representable by deep neural networks with rectified linear units (ReLU), gives an algorithm for training a ReLU deep neural network, improves the known lower bounds on the blow-up incurred when approximating a deep ReLU network function by a shallower ReLU network, and proves the corresponding gap theorems.
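The function class in question is easy to picture: a one-hidden-layer ReLU network computes a continuous piecewise-linear function. The sketch below is purely illustrative (it is not the paper's training algorithm, and the names are made up), showing the forward pass of such a network on a scalar input:

```python
def relu(z):
    """Rectified linear unit: max(0, z)."""
    return max(0.0, z)

def one_hidden_layer(x, weights, biases, out_weights):
    """Forward pass of a one-hidden-layer ReLU net on scalar input x:
    f(x) = sum_i a_i * relu(w_i * x + b_i), a piecewise-linear function."""
    return sum(a * relu(w * x + b)
               for w, b, a in zip(weights, biases, out_weights))

# A single hidden unit with w=1, b=0, a=1 reproduces relu itself.
print(one_hidden_layer(2.0, [1.0], [0.0], [1.0]))   # 2.0
print(one_hidden_layer(-1.0, [1.0], [0.0], [1.0]))  # 0.0
```

The gap theorems the paper proves concern how many such hidden units a shallow network needs to match a function that a deeper ReLU network represents compactly.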
Abstract
In this paper we investigate the family of functions representable by deep neural networks (DNN) with rectified linear units (ReLU). We give the first-ever polynomial time (in the size of data) algorithm to train …