Adversarial diffusion distillation
A Sauer, D Lorenz, A Blattmann… - arXiv preprint arXiv …, 2023 - arxiv.org
We introduce Adversarial Diffusion Distillation (ADD), a novel training approach that
efficiently samples large-scale foundational image diffusion models in just 1-4 steps while …
Score identity distillation: Exponentially fast distillation of pretrained diffusion models for one-step generation
We introduce Score identity Distillation (SiD), an innovative data-free method that distills the
generative capabilities of pretrained diffusion models into a single-step generator. SiD not …
Your Student is Better Than Expected: Adaptive Teacher-Student Collaboration for Text-Conditional Diffusion Models
N Starodubcev, D Baranchuk… - Proceedings of the …, 2024 - openaccess.thecvf.com
Knowledge distillation methods have recently been shown to be a promising direction to
speed up the synthesis of large-scale diffusion models by requiring only a few inference …
Score-based Diffusion Models via Stochastic Differential Equations--a Technical Tutorial
This is an expository article on score-based diffusion models, with a particular focus on
the formulation via stochastic differential equations (SDE). After a gentle introduction, we …
Towards a mathematical theory for consistency training in diffusion models
Consistency models, which were proposed to mitigate the high computational overhead
during the sampling phase of diffusion models, facilitate single-step sampling while attaining …
Exploring the frontiers of softmax: Provable optimization, applications in diffusion model, and beyond
The softmax activation function plays a crucial role in the success of large language models
(LLMs), particularly in the self-attention mechanism of the widely adopted Transformer …
CoDi: Conditional Diffusion Distillation for Higher-Fidelity and Faster Image Generation
Large generative diffusion models have revolutionized text-to-image generation and offer
immense potential for conditional generation tasks such as image enhancement, restoration …
Residual Learning in Diffusion Models
Diffusion models (DMs) have achieved remarkable generative performance, particularly with
the introduction of stochastic differential equations (SDEs). Nevertheless, a gap emerges in …
Hyper-sd: Trajectory segmented consistency model for efficient image synthesis
Recently, a series of diffusion-aware distillation algorithms have emerged to alleviate the
computational overhead associated with the multi-step inference process of Diffusion …
Distilling Diffusion Models into Conditional GANs
We propose a method to distill a complex multistep diffusion model into a single-step
conditional GAN student model, dramatically accelerating inference, while preserving image …