Markov decision processes (MDPs) are a well-established model for sequential decision-making in the presence of probabilistic uncertainty. In robust MDPs (RMDPs), every action is associated with an uncertainty set of probability distributions, modelling that the transition probabilities are not known precisely.
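To make the notion of uncertainty sets concrete, the following is a minimal sketch (not taken from this paper) of robust value iteration under (s,a)-rectangular *interval* uncertainty sets: each transition probability is only known to lie in an interval, and the adversary picks the worst distribution consistent with those intervals. All function names, the interval-based uncertainty model, and the greedy inner solver are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def worst_case_expectation(values, p_lo, p_hi):
    """Adversarial (minimal) expectation of `values` over the interval
    uncertainty set {p : p_lo <= p <= p_hi, sum(p) = 1}.
    Solved greedily: start from the lower bounds, then pour the remaining
    probability mass onto the worst (lowest-value) successor states first."""
    p = p_lo.astype(float).copy()
    remaining = 1.0 - p.sum()
    for i in np.argsort(values):          # worst outcomes first
        add = min(p_hi[i] - p_lo[i], remaining)
        p[i] += add
        remaining -= add
    return float(p @ values)

def robust_value_iteration(rewards, p_lo, p_hi, gamma=0.9, iters=500):
    """Robust Bellman iteration, assuming (s,a)-rectangular interval sets.
    rewards: array of shape (S, A); p_lo, p_hi: arrays of shape (S, A, S)
    giving per-transition probability bounds."""
    S, A = rewards.shape
    V = np.zeros(S)
    for _ in range(iters):
        Q = np.array([[rewards[s, a]
                       + gamma * worst_case_expectation(V, p_lo[s, a], p_hi[s, a])
                       for a in range(A)] for s in range(S)])
        V = Q.max(axis=1)                 # agent maximises, adversary minimises
    return V
```

When the interval bounds coincide (`p_lo == p_hi`), the uncertainty sets are singletons and the iteration reduces to standard value iteration on an ordinary MDP.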