May 2023
Sharpness-Aware Minimization Leads to Low-Rank Features
Maksym Andriushchenko, Dara Bahri, Hossein Mobahi, Nicolas Flammarion
TL;DR: The sharpness-aware minimization (SAM) method reduces feature rank across various types of neural networks, and the phenomenon can already be observed in a simple two-layer network. SAM prunes a significant fraction of activations, which directly drives this rank reduction. The low-rank effect also occurs in deep networks, although the overall mechanism there can be more complex.
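The two-layer setting in the TL;DR can be sketched numerically. The snippet below is a minimal, hypothetical illustration (not the paper's code): it trains a toy two-layer ReLU regression network with a basic SAM update (ascend along the normalized gradient by radius `rho`, then descend using the gradient at the perturbed point), and afterwards measures the two quantities the summary mentions, namely the fraction of pruned (zero) activations and the numerical rank of the feature matrix. All sizes, learning rates, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer ReLU network f(x) = w2 . relu(W1 x) on a synthetic
# regression task (hypothetical setup for illustration only).
n, d, h = 64, 10, 32
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true

W1 = rng.normal(size=(h, d)) * 0.5
w2 = rng.normal(size=h) * 0.5

def forward(W1, w2, X):
    pre = X @ W1.T            # pre-activations, shape (n, h)
    act = np.maximum(pre, 0)  # ReLU features
    return pre, act, act @ w2

def grads(W1, w2, X, y):
    # Gradients of the mean squared error (scaled by 1/2) by hand.
    pre, act, pred = forward(W1, w2, X)
    r = (pred - y) / len(y)
    g_w2 = act.T @ r
    g_act = np.outer(r, w2) * (pre > 0)  # backprop through ReLU
    g_W1 = g_act.T @ X
    return g_W1, g_w2

rho, lr = 0.05, 0.01
for _ in range(500):
    # SAM step: first ascend to the worst-case nearby point...
    g_W1, g_w2 = grads(W1, w2, X, y)
    norm = np.sqrt((g_W1**2).sum() + (g_w2**2).sum()) + 1e-12
    W1p = W1 + rho * g_W1 / norm
    w2p = w2 + rho * g_w2 / norm
    # ...then descend using the gradient taken at the perturbed weights.
    g_W1, g_w2 = grads(W1p, w2p, X, y)
    W1 -= lr * g_W1
    w2 -= lr * g_w2

_, act, _ = forward(W1, w2, X)
frac_dead = (act <= 0).mean()  # fraction of pruned activations
svals = np.linalg.svd(act, compute_uv=False)
rank = int((svals > 1e-6 * svals[0]).sum())  # numerical feature rank
print(f"pruned activations: {frac_dead:.2f}, feature rank: {rank}/{h}")
```

On this toy problem one can compare the final `frac_dead` and `rank` against the same loop with `rho = 0` (plain gradient descent) to see whether the SAM variant ends up with more pruned activations and a lower-rank feature matrix, mirroring the effect described above.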