May 2023
ReLU soothes the NTK condition number and accelerates optimization for wide neural networks
Chaoyue Liu, Like Hui
TL;DR
This paper studies the role of the rectified linear unit (ReLU) in neural networks, showing that ReLU improves the separation of the data and the conditioning of the neural tangent kernel (NTK), which in turn improves the convergence rate of optimization for wide networks.
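As a concrete illustration of the claim above, here is a minimal sketch (not taken from the paper) that compares the condition number of the empirical NTK Gram matrix of a random two-layer network with a ReLU activation against the same network with a linear activation. The network form f(x) = a^T sigma(Wx)/sqrt(m), the Gaussian initialization, the toy data on the unit sphere, and the helper name ntk_gram are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def ntk_gram(X, m=4096, relu=True):
    """Empirical NTK Gram matrix of f(x) = a^T sigma(Wx) / sqrt(m)
    at random Gaussian initialization, with both layers trainable."""
    n, d = X.shape
    W = rng.normal(size=(m, d))           # first-layer weights
    a = rng.normal(size=m)                # second-layer weights
    Z = X @ W.T                           # pre-activations, shape (n, m)
    if relu:
        act, dact = np.maximum(Z, 0.0), (Z > 0).astype(float)
    else:
        act, dact = Z, np.ones_like(Z)    # linear-activation baseline
    # First-layer gradient contribution:
    #   K1[i, j] = (1/m) * sum_k a_k^2 * sigma'(z_ik) * sigma'(z_jk) * <x_i, x_j>
    K1 = ((dact * a**2) @ dact.T) * (X @ X.T) / m
    # Second-layer gradient contribution:
    #   K2[i, j] = (1/m) * sum_k sigma(z_ik) * sigma(z_jk)
    K2 = act @ act.T / m
    return K1 + K2

# Toy data: points on the unit sphere (illustrative choice only)
n, d = 50, 10
X = rng.normal(size=(n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)

for use_relu in (False, True):
    K = ntk_gram(X, relu=use_relu)
    label = "ReLU  " if use_relu else "linear"
    print(f"{label} NTK condition number: {np.linalg.cond(K):.3e}")
```

In this particular setup the linear-activation kernel is rank-deficient (rank at most d < n), so its reported condition number is astronomically large, while the ReLU kernel stays full rank with a far smaller condition number; the paper's contribution is the quantitative analysis of how much ReLU improves NTK conditioning and, with it, the optimization speed.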
Abstract
Rectified linear unit (ReLU), as a non-linear activation function, is well known to improve the expressivity of neural networks such that …