May 2024
Adv-KD: Adversarial Knowledge Distillation for Faster Diffusion Sampling
Kidist Amde Mekonnen, Nicola Dall'Asen, Paolo Rota
TL;DR
This work proposes a new approach that integrates the denoising steps directly into the model's architecture, combining diffusion models with generative adversarial networks via knowledge distillation. This enables more efficient training and evaluation, reducing the required parameters and denoising steps and speeding up sampling at test time.
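As a rough illustration of the training recipe the TL;DR describes, here is a minimal PyTorch sketch of adversarial knowledge distillation from a diffusion teacher to a one-step student generator. The `Student` and `Discriminator` architectures, the loss weighting `lam`, and the random tensor standing in for the teacher's multi-step samples are all illustrative assumptions, not the paper's actual setup.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

IMG = 3 * 32 * 32  # flattened 3x32x32 images, for simplicity

class Student(nn.Module):
    """One-step generator: maps a noise vector straight to an image."""
    def __init__(self, z_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(z_dim, 256), nn.ReLU(),
            nn.Linear(256, IMG), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """Scores how teacher-like a (flattened) image looks."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(IMG, 256), nn.LeakyReLU(0.2),
            nn.Linear(256, 1),
        )

    def forward(self, x):
        return self.net(x)

def train_step(student, disc, opt_g, opt_d, z, teacher_out, lam=0.5):
    """One adversarial-distillation update. `teacher_out` is the diffusion
    teacher's sample for noise `z`, produced offline by its multi-step sampler."""
    fake = student(z)

    # Discriminator: teacher samples are "real", student samples are "fake".
    real_logits = disc(teacher_out)
    fake_logits = disc(fake.detach())
    d_loss = (F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
              + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Student: fool the discriminator while regressing onto the teacher output.
    g_logits = disc(fake)
    adv = F.binary_cross_entropy_with_logits(g_logits, torch.ones_like(g_logits))
    distill = F.mse_loss(fake, teacher_out)
    g_loss = adv + lam * distill
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()

if __name__ == "__main__":
    torch.manual_seed(0)
    student, disc = Student(), Discriminator()
    opt_g = torch.optim.Adam(student.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
    z = torch.randn(8, 64)
    teacher_out = torch.tanh(torch.randn(8, IMG))  # stand-in for real teacher samples
    for step in range(3):
        d_l, g_l = train_step(student, disc, opt_g, opt_d, z, teacher_out)
        print(f"step {step}: d_loss={d_l:.3f}  g_loss={g_l:.3f}")
```

At test time only the student is kept, so sampling collapses from many sequential denoising steps to a single forward pass, which is the source of the claimed speedup.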
Abstract
Diffusion probabilistic models (DPMs) have emerged as a powerful class of deep generative models, achieving remarkable performance in image synthesis tasks. However, these models face challenges in terms of widespread …