BriefGPT.xyz
Feb, 2021
Proximal Gradient Descent-Ascent: Variable Convergence under KŁ Geometry
Ziyi Chen, Yi Zhou, Tengyu Xu, Yingbin Liang
TL;DR
This paper studies the convergence of a broad class of two-player nonconvex-strongly-concave minimax optimization problems, proving that the Proximal-GDA algorithm attains different convergence rates across the full spectrum of the KŁ-parameterized geometry. This is the first theoretical result on variable convergence for nonconvex minimax optimization.
Abstract
The gradient descent-ascent (GDA) algorithm has been widely applied to solve minimax optimization problems. In order to achieve convergent policy parameters for minimax optimization, it is important that GDA gene…
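To make the algorithm concrete, here is a minimal sketch of proximal gradient descent-ascent: at each iteration, a gradient descent step on the min variable x followed by the proximal map of a regularizer g, then a gradient ascent step on the max variable y. The objective f(x, y) = 0.5x² + xy − y² (strongly concave in y, with a saddle at the origin), the l1 regularizer, and all step sizes are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

def grad_x(x, y):
    return x + y          # partial derivative of f w.r.t. x

def grad_y(x, y):
    return x - 2.0 * y    # partial derivative of f w.r.t. y

def prox_gda(x0, y0, eta_x=0.05, eta_y=0.1, lam=0.01, iters=2000):
    """Proximal-GDA sketch: descent step on x with a soft-threshold
    prox for g(x) = lam*|x|, then an ascent step on y."""
    x, y = x0, y0
    for _ in range(iters):
        # gradient descent step on x, then the proximal map of lam*|x|
        # (soft-thresholding with threshold eta_x * lam)
        x_half = x - eta_x * grad_x(x, y)
        x = np.sign(x_half) * max(abs(x_half) - eta_x * lam, 0.0)
        # gradient ascent step on y
        y = y + eta_y * grad_y(x, y)
    return x, y

x_star, y_star = prox_gda(1.0, 1.0)
print(x_star, y_star)  # both variables approach the saddle point at the origin
```

For this toy objective both iterates contract toward (0, 0); the paper's contribution concerns how fast such convergence happens as the local KŁ geometry varies, which this sketch does not attempt to reproduce.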