Background: deep neural networks have proven to be powerful computational
tools for modeling, prediction, and generation. However, the workings of these
models have generally been opaque. Recent work has shown that the performance
of some models is modulated by overlapping functional