Nov, 2023
Expressivity of ReLU-Networks under Convex Relaxations
Maximilian Baader, Mark Niklas Müller, Yuhao Mao, Martin Vechev
TL;DR
Through an in-depth study of commonly used convex relaxations, we find that: (i) more advanced relaxations allow a larger class of univariate functions to be expressed by ReLU networks that can be analyzed precisely, (ii) more precise relaxations permit an exponentially larger solution space of ReLU networks encoding the same function, and (iii) even with the most precise single-neuron relaxations, it is impossible to construct ReLU networks that can be precisely analyzed for multivariate, convex, monotone piecewise linear functions.
Abstract
Convex relaxations are a key component of training and certifying provably safe neural networks. However, despite substantial progress, a wide and poorly understood accuracy gap to standard networks remains, raising the question of whether this is caused by fundamental limitations of convex relaxations.
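
To make concrete what "precisely analyzed" means here, below is a minimal, self-contained sketch (our own illustration, not code from the paper) of interval bound propagation (IBP), the simplest convex relaxation in this family. The network encoding |x| as ReLU(x) + ReLU(-x) is a standard example; the helper names are hypothetical.

```python
import numpy as np

def ibp_linear(lb, ub, W, b):
    # Propagate a box [lb, ub] through the affine map x -> Wx + b.
    # IBP tracks only per-neuron intervals, dropping all correlations.
    center = (lb + ub) / 2.0
    radius = (ub - lb) / 2.0
    c = W @ center + b
    r = np.abs(W) @ radius
    return c - r, c + r

def ibp_relu(lb, ub):
    # ReLU is monotone, so applying it to the endpoints is exact per neuron.
    return np.maximum(lb, 0.0), np.maximum(ub, 0.0)

# |x| = ReLU(x) + ReLU(-x), a univariate convex piecewise linear function.
W1, b1 = np.array([[1.0], [-1.0]]), np.zeros(2)   # pre-activations: x, -x
W2, b2 = np.array([[1.0, 1.0]]), np.zeros(1)      # sum the two ReLU outputs

lb, ub = np.array([-1.0]), np.array([1.0])        # input region x in [-1, 1]
lb, ub = ibp_relu(*ibp_linear(lb, ub, W1, b1))
lb, ub = ibp_linear(lb, ub, W2, b2)
print(lb, ub)  # [0.] [2.], although the exact image of |x| on [-1, 1] is [0, 1]
```

The analysis is imprecise for this particular encoding because IBP loses the anti-correlation between the inputs of the two ReLUs. Whether some other ReLU encoding of the same function admits a precise analysis, and how this changes under relaxations more precise than IBP, is the expressivity question the paper studies.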