Diffusion Models Tutorial

Diffusion models are a class of probabilistic generative models in machine learning. They have attracted attention for their ability to generate highly realistic images, and they perform well on tasks such as text-to-image generation, image inpainting (filling in missing regions of an image), and super-resolution (enhancing image quality). If you want to experiment with a diffusion model, you can try one at https://stablediffusionweb.com.

Understanding diffusion models requires working through several complex equations. I struggled with them myself, and honestly, I am still learning. In this tutorial, I therefore try to organize what I have studied in a more accessible way, and I hope it will be helpful to others studying diffusion models.

This tutorial is divided into three parts.

  1. Part 1 summarizes the background knowledge needed for diffusion models. Refer to it if you need a refresher on concepts such as expectation and the evidence lower bound (ELBO).
  2. Part 2 covers the fundamentals of diffusion models, including the forward and reverse processes, and then implements a diffusion model based on those ideas; a minimal sketch of the forward process follows this list.
  3. Diffusion models can generate high-quality images, but sampling typically takes hundreds to thousands of denoising steps, which makes generation slow. Several methods have been proposed to tackle this issue; Part 3 focuses on DDIM, one of the first approaches proposed to speed up sampling.
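
The key idea behind Part 2's forward process is that noise can be added to a clean sample in a single closed-form step. Below is a minimal, hedged sketch of that step; the PyTorch setup, the schedule length `T`, the linear beta range, and the function name `q_sample` are illustrative assumptions rather than the repository's actual code.

```python
import torch

T = 1000                                   # number of diffusion steps (assumption)
betas = torch.linspace(1e-4, 0.02, T)      # linear noise schedule, as in DDPM
alphas = 1.0 - betas
alpha_bars = torch.cumprod(alphas, dim=0)  # cumulative products: alpha_bar_t

def q_sample(x0: torch.Tensor, t: torch.Tensor):
    """Draw x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps, eps ~ N(0, I)."""
    eps = torch.randn_like(x0)
    a_bar = alpha_bars[t].view(-1, *([1] * (x0.dim() - 1)))  # broadcast over image dims
    xt = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps
    return xt, eps  # a network is trained to predict eps from (x_t, t)
```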

In this tutorial, we will also introduce a more advanced sampling method.
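
As context for Part 3, here is a hedged sketch of a single deterministic DDIM update (the eta = 0 case), which moves a sample from timestep t to an earlier timestep without adding fresh noise. It reuses the `alpha_bars` schedule from the sketch above, and `eps_pred` stands in for the output of a trained noise-prediction network; both names are assumptions for illustration.

```python
import torch

def ddim_step(xt: torch.Tensor, eps_pred: torch.Tensor,
              a_bar_t: torch.Tensor, a_bar_prev: torch.Tensor) -> torch.Tensor:
    """One deterministic DDIM update from x_t to x_{t_prev} (eta = 0)."""
    # Estimate the clean image implied by the current sample and the predicted noise.
    x0_pred = (xt - (1.0 - a_bar_t).sqrt() * eps_pred) / a_bar_t.sqrt()
    # Deterministically re-noise that estimate to the earlier timestep.
    return a_bar_prev.sqrt() * x0_pred + (1.0 - a_bar_prev).sqrt() * eps_pred
```

Because each update is deterministic, DDIM can skip timesteps (for example, using 50 evenly spaced steps instead of 1000) while reusing the same trained model.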

Table of contents

| Title | Link | Updated (YY.MM.DD) |
| --- | --- | --- |
| PART 1. Background | | |
| 1. Expectation and variance | LINK | 23.05.03 |
| 2. Reparameterization trick | LINK | 23.05.03 |
| 3. Kullback–Leibler divergence | LINK | 23.05.03 |
| 4. Evidence lower bound | LINK | 23.05.03 |
| PART 2. Diffusion Models | | |
| 1. Introduction | LINK | 23.05.04 |
| 2. Forward process | LINK | 23.05.04 |
| 3. Reverse process | LINK | 22.12.14 |
| 4. Noise schedule | LINK | 22.12.14 |
| 5. Example: DDPM | LINK | 22.12.19 |
| PART 3. DDIM | | |
| 1. DDIM | LINK | 22.12.23 |
| 2. Example: DDIM | LINK | 22.12.23 |

References

  1. Expected value (Wikipedia)
  2. Variance (Wikipedia)
  3. Jensen's inequality (Wikipedia)
  4. Kullback–Leibler divergence (Wikipedia)
  5. Bayes' theorem (Wikipedia)
  6. 공돌이의 수학정리노트 (Korean-language math blog)
  7. Matthew N. Bernstein
  8. Generative Modeling by Estimating Gradients of the Data Distribution (Yang Song, 2021)
  9. What are Diffusion Models? (Lilian Weng, 2021)
  10. Diffusion models explained. How does OpenAI's GLIDE work? (AI Coffee Break with Letitia, 2022)
  11. How does Stable Diffusion work? – Latent Diffusion Models EXPLAINED (AI Coffee Break with Letitia, 2022)
  12. Diffusion Models | Paper Explanation | Math Explained (Outlier, 2022)
  13. DDPM - Diffusion Models Beat GANs on Image Synthesis (Machine Learning Research Paper Explained) (Yannic Kilcher, 2021)
  14. Diffusion models from scratch in PyTorch (DeepFindr, 2022)
  15. Denoising Diffusion Probabilistic Models (Jonathan Ho, Ajay Jain, Pieter Abbeel, 2020)
  16. Diffusion Models Beat GANs on Image Synthesis (Prafulla Dhariwal, Alex Nichol, 2021)
  17. Improved Denoising Diffusion Probabilistic Models (Alex Nichol, Prafulla Dhariwal, 2021)
  18. Diffusion Models: A Comprehensive Survey of Methods and Applications (Ling Yang, et al., 2022)
  19. Diffusion Models in Vision: A Survey (Florinel-Alin Croitoru, et al., 2022)
  20. The Annotated Diffusion Model
  21. Conditional Diffusion MNIST
