Gradient boosting of prediction rules is an efficient approach to learning potentially interpretable yet accurate probabilistic models. However, actual interpretability requires limiting the number and size of the rules in the ensemble.
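To make the idea concrete, here is a minimal sketch (not the paper's algorithm) of boosting with single-split if-then rules as base learners: each round fits one rule to the current residuals, so the final model is a small, readable list of rules. All function names are illustrative.

```python
import numpy as np

def fit_rule(X, r):
    """Find the single-feature threshold rule that best fits residuals r (least squares)."""
    best_sse, best_rule = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j])[:-1]:  # exclude max so the rule never covers everything
            mask = X[:, j] <= t
            v_in, v_out = r[mask].mean(), r[~mask].mean()
            sse = ((r[mask] - v_in) ** 2).sum() + ((r[~mask] - v_out) ** 2).sum()
            if sse < best_sse:
                best_sse, best_rule = sse, (j, t, v_in, v_out)
    return best_rule

def boost_rules(X, y, n_rules=50, lr=0.1):
    """Gradient boosting with one if-then rule per round (squared-error loss)."""
    pred = np.full(len(y), y.mean())
    rules = []
    for _ in range(n_rules):
        j, t, v_in, v_out = fit_rule(X, y - pred)  # fit the residuals
        rules.append((j, t, lr * v_in, lr * v_out))
        pred += np.where(X[:, j] <= t, lr * v_in, lr * v_out)
    return y.mean(), rules

def predict(base, rules, X):
    pred = np.full(len(X), base)
    for j, t, v_in, v_out in rules:
        pred += np.where(X[:, j] <= t, v_in, v_out)
    return pred
```

Capping `n_rules` and restricting each rule to a single condition is exactly the kind of constraint the interpretability concern above points at: the whole model remains a short list of human-readable conditions.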
Wasserstein gradient boosting is a novel ensemble method that follows the Wasserstein gradient of a loss functional to approximate a target probability distribution, yielding a distributional estimate of the output-distribution parameter; it outperforms existing methods in probabilistic prediction.
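As a rough illustration of the underlying notion of a Wasserstein gradient (not the boosting algorithm itself), the unadjusted Langevin scheme below is a standard particle discretization of the Wasserstein gradient flow of the KL divergence toward a target distribution; with a Gaussian target the particles drift and diffuse until their empirical distribution matches it. The target parameters are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 2.0, 0.5          # illustrative Gaussian target N(mu, sigma^2)
x = rng.normal(0.0, 3.0, 5000)  # particle approximation of the evolving distribution
eps = 0.01                    # step size of the discretized flow

for _ in range(2000):
    grad_log_pi = -(x - mu) / sigma**2            # score of the target density
    x = x + eps * grad_log_pi + np.sqrt(2 * eps) * rng.normal(size=x.size)

# after many steps the particle cloud approximates the target:
# x.mean() is near mu and x.std() is near sigma
```

Each step moves every particle along the steepest-descent direction of KL divergence in Wasserstein geometry (drift toward high target density plus diffusion); in Wasserstein gradient boosting this kind of update direction is instead fitted by weak learners, which is what makes the construction a boosting method.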