Feb 2022
Fast Convex Optimization for Two-Layer ReLU Networks: Equivalent Model Classes and Cone Decompositions
Aaron Mishkin, Arda Sahiner, Mert Pilanci
TL;DR
This paper studies convex optimization of two-layer neural networks with ReLU activation functions, using group-lasso regularization and an accelerated proximal gradient algorithm; the method performs well on image classification with the MNIST and CIFAR-10 datasets.
Abstract
We develop fast algorithms and robust software for convex optimization of two-layer neural networks with ReLU activation functions. Our work …
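The TL;DR names the optimization template the paper builds on: a group-lasso (group-ℓ1) regularized convex problem solved with an accelerated proximal gradient method. Below is a minimal sketch of that generic template, assuming a plain least-squares data-fitting term; it is not the paper's solver (which operates on a convex reformulation of the two-layer ReLU network), and the function names and toy problem here are invented for illustration.

```python
# A minimal sketch (not the paper's implementation) of accelerated proximal
# gradient (FISTA) for a group-lasso-regularized least-squares problem:
#     min_w 0.5 * ||A w - y||^2 + lam * sum_g ||w_g||_2
import numpy as np

def prox_group_l1(v, groups, thresh):
    """Group soft-thresholding: shrink each group's block of v toward zero."""
    out = v.copy()
    for g in groups:
        norm = np.linalg.norm(v[g])
        out[g] = 0.0 if norm <= thresh else (1.0 - thresh / norm) * v[g]
    return out

def fista_group_lasso(A, y, groups, lam, iters=500):
    n, d = A.shape
    w = np.zeros(d)
    z, s = w.copy(), 1.0
    # Step size 1/L, where L = ||A||_2^2 is the gradient's Lipschitz constant.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(iters):
        grad = A.T @ (A @ z - y)  # gradient of the smooth term at z
        w_next = prox_group_l1(z - step * grad, groups, step * lam)
        s_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * s ** 2))
        z = w_next + ((s - 1.0) / s_next) * (w_next - w)  # Nesterov momentum
        w, s = w_next, s_next
    return w

# Toy usage: 10 features split into 5 groups of 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
y = rng.standard_normal(50)
groups = [slice(2 * i, 2 * i + 2) for i in range(5)]
w_hat = fista_group_lasso(A, y, groups, lam=1.0)
print(w_hat)
```

The group prox is what makes the regularizer "group lasso": each group's coefficient block is either shrunk as a unit or zeroed out entirely, which is the mechanism that lets the convex reformulation prune whole neurons.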