Jul 2023
Deep Network Approximation: Beyond ReLU to Diverse Activation Functions
Shijun Zhang, Jianfeng Lu, Hongkai Zhao
TL;DR
This paper studies the expressive power of deep neural networks for a diverse range of activation functions, proving that networks built on these activation functions can reproduce known approximation guarantees on any bounded set at the cost of only slightly larger constants.
Abstract
This paper explores the expressive power of deep neural networks for a diverse range of activation functions. An activation function set …
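As a hedged illustration of the transfer-type result the TL;DR describes (the set symbol $\mathscr{A}$, the width and depth parameters $N$ and $L$, and the constants $C_1$, $C_2$ below are assumed placeholders, not values quoted from this page), such a statement typically reads: for any activation function $\varrho \in \mathscr{A}$, any bounded set $E \subset \mathbb{R}^d$, and any target accuracy $\varepsilon > 0$, a ReLU network $\phi$ of width $N$ and depth $L$ admits a $\varrho$-activated network $\psi$ of width $C_1 N$ and depth $C_2 L$ with

$$\sup_{x \in E} \, |\phi(x) - \psi(x)| \le \varepsilon,$$

so approximation rates established for ReLU networks carry over to $\varrho$-activated networks with only slightly larger constants.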