Oct, 2018
Evolutionary Stochastic Gradient Descent for Optimization of Deep Neural Networks
Xiaodong Cui, Wei Zhang, Zoltán Tüske, Michael Picheny
TL;DR
This work proposes a population-based Evolutionary Stochastic Gradient Descent (ESGD) framework for optimizing deep neural networks, which combines SGD and gradient-free evolutionary algorithms as complementary methods to improve the average fitness of the population.
Abstract
We propose a population-based evolutionary stochastic gradient descent (ESGD) framework for optimizing deep neural networks. ESGD combines SGD and gradient-free evolutionary algorithms as complementary algorithms to improve the average fitness of the population.
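The idea of alternating an SGD phase with an evolution phase over a population can be illustrated with a toy sketch. This is not the authors' implementation: the objective is a simple quadratic stand-in for a network's loss, the gradient is computed analytically, and all hyperparameters (population size, elite count, mutation scale) are made-up values for illustration.

```python
import random

def loss(w):
    # Toy quadratic objective standing in for a network's validation loss.
    return sum(x * x for x in w)

def grad(w):
    # Analytic gradient of the toy objective.
    return [2 * x for x in w]

def sgd_step(w, lr=0.1):
    return [x - lr * g for x, g in zip(w, grad(w))]

def esgd(pop_size=8, dim=4, generations=10, sgd_steps=5, elite=2, seed=0):
    rng = random.Random(seed)
    # Population of parameter vectors, randomly initialized.
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # SGD phase: every individual descends independently.
        for _ in range(sgd_steps):
            pop = [sgd_step(w) for w in pop]
        # Evolution phase (gradient-free): rank by fitness, keep the
        # elites unchanged, refill the rest with mutated elite copies.
        pop.sort(key=loss)
        elites = pop[:elite]
        pop = elites + [
            [x + rng.gauss(0, 0.05) for x in rng.choice(elites)]
            for _ in range(pop_size - elite)
        ]
    return min(loss(w) for w in pop)
```

Keeping the elites unmutated guarantees the best fitness in the population never degrades, so alternating the two phases steadily improves the population's average fitness on this toy problem.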