Jun, 2017
Recovery Guarantees for One-hidden-layer Neural Networks
Kai Zhong, Zhao Song, Prateek Jain, Peter L. Bartlett, Inderjit S. Dhillon
TL;DR
This paper considers regression problems solved with a one-hidden-layer neural network model. It distills properties of activation functions, satisfied by ReLU, leaky ReLU, sigmoid, and others, that lead to local strong convexity. The paper further proposes initializing the parameters with a tensor method and then running gradient descent, and shows that this pipeline solves the regression problem with sample and computational complexity linear in the input dimension and logarithmic in the target precision.
Abstract
In this paper, we consider regression problems with one-hidden-layer neural networks (1NNs). We distill some properties of activation functions that lead to $\mathit{local~strong~convexity}$ in the neighborhood of the ground-truth parameters for the 1NN squared-loss objective. Popular nonlinear activation functions such as ReLUs, leaky ReLUs, and sigmoids satisfy these properties. We show that tensor methods can initialize the parameters to fall into this local strong convexity region, so that tensor initialization followed by gradient descent recovers the ground-truth parameters with sample and computational complexity linear in the input dimension and logarithmic in the target precision.
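
To make the pipeline described above more concrete, the sketch below illustrates only the gradient-descent phase on the squared loss of a one-hidden-layer ReLU network. It is a minimal illustration, not the paper's algorithm: the tensor-method initialization is not implemented and is replaced by a hypothetical warm start W0, and all function names, shapes, and hyperparameters here are assumptions for demonstration.

# Minimal sketch (assumptions noted above): gradient descent on the squared
# loss of a 1NN  y = sum_j v_j * relu(w_j^T x), with output weights v fixed.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def predict(X, W, v):
    # X: (n, d), W: (k, d), v: (k,) -> predictions of shape (n,)
    return relu(X @ W.T) @ v

def squared_loss(X, y, W, v):
    r = predict(X, W, v) - y
    return 0.5 * np.mean(r ** 2)

def grad_W(X, y, W, v):
    # Gradient of the squared loss w.r.t. the hidden-layer weights W.
    H = X @ W.T                        # (n, k) pre-activations
    r = relu(H) @ v - y                # (n,) residuals
    G = (H > 0).astype(float) * v      # (n, k): ReLU subgradient times v_j
    return (G * r[:, None]).T @ X / X.shape[0]   # (k, d)

def gradient_descent(X, y, W0, v, lr=0.1, iters=500):
    W = W0.copy()
    for _ in range(iters):
        W -= lr * grad_W(X, y, W, v)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d, k = 2000, 10, 3
    W_true = rng.normal(size=(k, d))
    v = np.ones(k)                                # known output weights
    X = rng.normal(size=(n, d))
    y = predict(X, W_true, v)                     # noiseless teacher labels
    W0 = W_true + 0.1 * rng.normal(size=(k, d))   # stand-in for tensor init
    W_hat = gradient_descent(X, y, W0, v)
    print("final loss:", squared_loss(X, y, W_hat, v))

Starting from a warm start inside the local strong convexity region (here faked by perturbing the ground truth), plain gradient descent drives the squared loss toward zero; in the paper this warm start is produced by the tensor method rather than assumed.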