Sep, 2020
Deep Equals Shallow for ReLU Networks in Kernel Regimes
Alberto Bietti, Francis Bach
TL;DR
This paper studies deep fully-connected networks from an approximation viewpoint and shows that, in kernel regimes, they are equivalent to shallow two-layer networks, indicating that their generalization ability is limited in certain respects by the kernel framework; the analysis proceeds through the eigenvalue decay of the associated kernels.
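The eigenvalue comparison can be illustrated numerically. The sketch below is a minimal, illustrative experiment (not from the paper): it uses the standard NTK recursion for fully-connected ReLU networks without biases, built from the arc-cosine functions κ₀ and κ₁, evaluates the resulting Gram matrices on points of the unit circle for two different depths, and computes their eigenvalue spectra so that the decay profiles can be compared. The `ntk` helper and the point configuration are assumptions made for this sketch.

```python
import numpy as np

def ntk(u, depth):
    """NTK of a fully-connected ReLU network (no biases) on unit-norm inputs.

    u: matrix of cosine similarities x . x' (inputs on the unit sphere).
    depth: number of layers (depth >= 2 for at least one hidden layer).
    """
    # Arc-cosine kernels of degree 0 and 1 (Cho & Saul); clip guards arccos/sqrt domains.
    k0 = lambda t: (np.pi - np.arccos(np.clip(t, -1.0, 1.0))) / np.pi
    k1 = lambda t: (np.clip(t, -1.0, 1.0) * (np.pi - np.arccos(np.clip(t, -1.0, 1.0)))
                    + np.sqrt(np.clip(1.0 - t ** 2, 0.0, None))) / np.pi
    sigma = u       # covariance kernel of layer 1
    theta = u       # NTK of layer 1
    for _ in range(depth - 1):
        # Standard recursion: Theta^l = Sigma^l + Theta^{l-1} * kappa_0(Sigma^{l-1})
        theta = k1(sigma) + theta * k0(sigma)
        sigma = k1(sigma)
    return theta

# Evaluation points: n equispaced points on the unit circle S^1.
n = 200
angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X = np.stack([np.cos(angles), np.sin(angles)], axis=1)
G = X @ X.T  # cosine similarities

spectra = {}
for depth in (2, 5):
    K = ntk(G, depth)
    spectra[depth] = np.sort(np.linalg.eigvalsh(K))[::-1]  # descending eigenvalues
```

Plotting `spectra[2]` and `spectra[5]` on log-log axes lets one inspect whether the tail eigenvalues of the deep and shallow NTKs decay at the same polynomial rate, which is the kind of comparison the paper's equivalence result concerns.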
Abstract
Deep networks are often considered to be more expressive than shallow ones in terms of approximation. Indeed, certain functions can be approximated by …