Dec, 2023
Computational Tradeoffs of Optimization-Based Bound Tightening in ReLU Networks
Fabian Badilla, Marcos Goycoolea, Gonzalo Muñoz, Thiago Serra
TL;DR
The use of mixed-integer linear programming (MILP) models to represent neural networks with Rectified Linear Unit (ReLU) activations has become increasingly common over the past decade. This work examines the tradeoff between the tightness of these bounds and the computational effort required to solve the resulting MILP models, and offers guidelines for implementing such models based on the effects of network architecture, regularization, and rounding.
Abstract
The use of mixed-integer linear programming (MILP) models to represent neural networks with Rectified Linear Unit (ReLU) activations has become increasingly widespread in the last decade. This has enabled the use
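As background for the bound-tightening tradeoff discussed above, the sketch below shows the standard big-M MILP encoding of a single ReLU neuron y = max(0, x): given pre-activation bounds L ≤ x ≤ U with L < 0 < U and a binary indicator z, the constraints y ≥ 0, y ≥ x, y ≤ Uz, and y ≤ x − L(1 − z) exactly capture the ReLU. This is a generic illustration, not code from the paper; the function name and the chosen bounds are hypothetical. Tighter values of L and U (the goal of optimization-based bound tightening) shrink the feasible region of the linear relaxation.

```python
# Minimal sketch (assumed illustration, not from the paper) of the
# standard big-M MILP encoding of one ReLU neuron y = max(0, x),
# given pre-activation bounds L <= x <= U with L < 0 < U and a
# binary indicator z in {0, 1}.
def relu_bigm_constraints_hold(x, y, z, L, U, tol=1e-9):
    """Check the four big-M constraints of the ReLU encoding:
       y >= 0, y >= x, y <= U*z, y <= x - L*(1 - z)."""
    return (y >= -tol and
            y >= x - tol and
            y <= U * z + tol and
            y <= x - L * (1 - z) + tol)

# For any x within the bounds, the true ReLU output satisfies the
# constraints with z = 1 when the neuron is active and z = 0 otherwise.
for x in [-1.5, -0.2, 0.0, 0.3, 2.0]:
    y = max(0.0, x)
    z = 1 if x > 0 else 0
    assert relu_bigm_constraints_hold(x, y, z, L=-2.0, U=3.0)
```

The constraint y ≤ x − L(1 − z) forces y = x when z = 1, while y ≤ Uz forces y = 0 when z = 0, so the encoding is exact at integer z; how tight L and U are determines how well the LP relaxation (z ∈ [0, 1]) approximates this.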