July 2018
The Limit Points of (Optimistic) Gradient Descent in Min-Max Optimization
Constantinos Daskalakis, Ioannis Panageas
TL;DR
Studies the convergence properties of first-order methods in min-max problems, showing that the basic GDA and OGDA dynamics avoid unstable critical points for almost all initializations, that the set of OGDA-stable critical points is a superset of the set of GDA-stable critical points, and that the behavior of these dynamics can be studied from a dynamical-systems perspective.
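As a reference for the two dynamics compared above, here is a minimal sketch of their update rules in the formulation commonly used in this line of work (the paper's exact notation and step-size conventions may differ). For a min-max objective $f(x, y)$ with step size $\alpha > 0$:

GDA: $x_{t+1} = x_t - \alpha \nabla_x f(x_t, y_t), \quad y_{t+1} = y_t + \alpha \nabla_y f(x_t, y_t)$

OGDA: $x_{t+1} = x_t - 2\alpha \nabla_x f(x_t, y_t) + \alpha \nabla_x f(x_{t-1}, y_{t-1}), \quad y_{t+1} = y_t + 2\alpha \nabla_y f(x_t, y_t) - \alpha \nabla_y f(x_{t-1}, y_{t-1})$

The only difference is the correction term built from the previous gradient, which is what changes the set of critical points the dynamics can stably converge to.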
Abstract
Motivated by applications in Optimization, Game Theory, and the training of Generative Adversarial Networks, the convergence properties of first order methods in min-max problems have received extensive study.
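To make the contrast concrete, here is a minimal numerical sketch (not taken from the paper) that compares the two dynamics on the standard bilinear toy objective f(x, y) = x * y, whose unique min-max equilibrium is (0, 0); the step size and iteration count are arbitrary illustrative choices.

def grad(x, y):
    # For f(x, y) = x * y: df/dx = y, df/dy = x.
    return y, x

def gda(x, y, step=0.1, iters=2000):
    # Gradient Descent/Ascent: x descends and y ascends on the current gradient.
    for _ in range(iters):
        gx, gy = grad(x, y)
        x, y = x - step * gx, y + step * gy
    return x, y

def ogda(x, y, step=0.1, iters=2000):
    # Optimistic GDA: same step plus a correction from the previous gradient.
    gx_prev, gy_prev = grad(x, y)  # first iteration reduces to a plain GDA step
    for _ in range(iters):
        gx, gy = grad(x, y)
        x, y = (x - 2 * step * gx + step * gx_prev,
                y + 2 * step * gy - step * gy_prev)
        gx_prev, gy_prev = gx, gy
    return x, y

print("GDA :", gda(1.0, 1.0))   # spirals outward, away from the equilibrium
print("OGDA:", ogda(1.0, 1.0))  # contracts toward the equilibrium (0, 0)

With these choices GDA moves away from the saddle point of x * y while OGDA converges to it, which is consistent with the claim that the OGDA-stable critical points form a (sometimes strictly larger) superset of the GDA-stable ones.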