We propose the Convolutional Block Attention Module (CBAM), a simple yet
effective attention module for feed-forward convolutional neural networks.
Given an intermediate feature map, our module sequentially infers attention
maps along two separate dimensions, channel and spatial, then the