Oct, 2020
A Dynamical View on Optimization Algorithms of Overparameterized Neural Networks
Zhiqi Bu, Shiyun Xu, Kan Chen
TL;DR
By analyzing the relation between neural networks and their optimization algorithms, this work studies the loss dynamics of neural networks, a question that much recent research has focused on. It proves that with the ReLU activation, NAG may reach the global minimum only at a sublinear rate, and the results show that optimizing the non-convex loss function effectively amounts to minimizing the prediction error.
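For reference, the NAG (Nesterov accelerated gradient) update analyzed in the paper can be sketched on a toy least-squares loss; this is an illustrative implementation of the standard NAG iteration, not the paper's code, and the step size and momentum values below are assumptions:

```python
import numpy as np

# Minimal NAG sketch on the toy loss 0.5*||A x - b||^2.
# (The paper studies NAG on overparameterized ReLU networks;
# a quadratic is used here only to show the update rule.)
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(x):
    return A.T @ (A @ x - b)

eta = 1.0 / np.linalg.norm(A, 2) ** 2  # step size from the spectral norm
mu = 0.9                               # momentum coefficient (assumed value)
x = np.zeros(5)
v = np.zeros(5)
for _ in range(500):
    x_prev = x
    x = v - eta * grad(v)              # gradient step at the look-ahead point
    v = x + mu * (x - x_prev)          # Nesterov extrapolation

loss = 0.5 * np.linalg.norm(A @ x - b) ** 2
print(loss)
```

The look-ahead evaluation of the gradient at `v` rather than `x` is what distinguishes NAG from heavy-ball momentum.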
Abstract
When equipped with efficient optimization algorithms, over-parameterized neural networks have demonstrated a high level of performance even though the loss function is non-convex and non-smooth. While many works …