BriefGPT.xyz
Mar, 2023
Meta-Learning Parameterized First-Order Optimizers using Differentiable Convex Optimization
Tanmay Gautam, Samuel Pfrommer, Somayeh Sojoudi
TL;DR
We propose a meta-learning framework that uses differentiable convex optimization (DCO) in the inner-loop optimization step. This framework generalizes a broad family of existing update rules, and we demonstrate its theoretical appeal by showing it can optimize a family of linear least-squares problems in a single step.
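A minimal sketch (not the paper's implementation) of why single-step optimality is attainable for linear least-squares: with a suitably chosen scalar preconditioner, one parameterized gradient step lands exactly at the closed-form optimum. The problem instance and the preconditioner formula below are illustrative assumptions.

```python
def lstsq_loss(w, xs, ys):
    # Scalar linear least-squares objective: sum_i (w*x_i - y_i)^2
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys))

def grad(w, xs, ys):
    # Gradient of the objective with respect to w
    return sum(2 * x * (w * x - y) for x, y in zip(xs, ys))

def one_step(w0, P, xs, ys):
    # Parameterized first-order update: w1 = w0 - P * grad(w0)
    return w0 - P * grad(w0, xs, ys)

xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.2]

# For this problem class, the preconditioner P = 1 / (2 * sum(x^2))
# makes a single gradient step exactly optimal (an assumption-laden
# stand-in for what a meta-learned update rule could discover).
P = 1.0 / (2.0 * sum(x * x for x in xs))

w1 = one_step(0.0, P, xs, ys)

# Closed-form optimum of the scalar least-squares problem
w_star = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
print(abs(w1 - w_star) < 1e-12)  # → True
```

The point is that for convex inner problems the ideal update rule is analytically characterizable, which is what makes meta-learning over this family tractable to analyze.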
Abstract
Conventional optimization methods in machine learning and controls rely heavily on first-order update rules. Selecting the right method and hyperparameters for a particular task often involves trial-and-error or …