Hyperparameter optimization (HPO) is a powerful technique for automating the
tuning of machine learning (ML) models. However, in many real-world
applications, accuracy is only one of several performance criteria that must
be considered. Optimizing these objectives simultaneously on a
DEHB is a recent hyperparameter optimization method that combines the advantages of Hyperband and differential evolution. It achieves strong performance far more robustly across a broad range of HPO problems and tabular benchmarks, and is especially effective on high-dimensional problems with discrete input dimensions.
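To make the combination concrete, the sketch below illustrates the two ingredients DEHB builds on: Hyperband-style evaluation at geometrically increasing budgets with promotion of the best configurations, and differential evolution (mutation plus crossover) to propose new configurations. This is a simplified toy illustration under assumed names (`evaluate`, `dehb_sketch`, a 2-D continuous search space), not the actual DEHB algorithm, which additionally maintains per-budget subpopulations and full Hyperband bracket scheduling.

```python
import random


def evaluate(cfg, budget):
    """Toy stand-in for training a model on `cfg` for `budget` units of compute.

    Lower is better; noise shrinks as the budget grows, mimicking how
    longer training gives a more reliable estimate of final performance.
    """
    x, y = cfg
    return (x - 0.3) ** 2 + (y - 0.7) ** 2 + random.gauss(0, 0.01) / budget


def de_step(pop, F=0.5, CR=0.9):
    """One differential-evolution generation over a population of 2-D configs."""
    new_pop = []
    for i, target in enumerate(pop):
        # Pick three distinct other members for the rand/1 mutation strategy.
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = [ai + F * (bi - ci) for ai, bi, ci in zip(a, b, c)]
        # Binomial crossover between mutant and target, clipped to [0, 1].
        trial = [m if random.random() < CR else t for m, t in zip(mutant, target)]
        new_pop.append([min(1.0, max(0.0, v)) for v in trial])
    return new_pop


def dehb_sketch(min_budget=1, max_budget=27, eta=3, pop_size=12, seed=0):
    random.seed(seed)
    # Hyperband-style geometric budget schedule: 1, 3, 9, 27 with eta=3.
    budgets, b = [], min_budget
    while b <= max_budget:
        budgets.append(b)
        b *= eta
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for budget in budgets:
        pop = de_step(pop)  # DE proposes new configurations
        # Evaluate at the current budget and promote the top 1/eta
        # (keeping at least 4 members so DE mutation stays well-defined).
        scored = sorted(pop, key=lambda c: evaluate(c, budget))
        pop = scored[: max(4, len(scored) // eta)]
    return min(pop, key=lambda c: evaluate(c, max_budget))
```

On this toy objective the returned configuration lands near the optimum (0.3, 0.7); the point of the sketch is the interplay, with cheap low-budget rungs filtering DE's proposals before expensive high-budget evaluations.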