May 2024

General Bounds on the Quality of Bayesian Coresets

TL;DRBayesian coresets can speed up posterior inference by approximating the full-data log-likelihood function with a surrogate log-likelihood based on a small, weighted subset of the data. This paper provides general upper and lower bounds on the Kullback-Leibler divergence of coreset approximations, applicable in a wide range of models, and demonstrates the theory's flexibility in validation experiments involving multimodal, unidentifiable, heavy-tailed Bayesian posterior distributions.