
Consistency Models

This repository contains the codebase for Consistency Models, implemented using PyTorch for conducting large-scale experiments on ImageNet-64, LSUN Bedroom-256, and LSUN Cat-256. We have based our repository on openai/guided-diffusion, which was initially released under the MIT license. Our modifications have enabled support for consistency distillation, consistency training, as well as several sampling and editing algorithms discussed in the paper.

The repository for CIFAR-10 experiments is in JAX and can be found at openai/consistency_models_cifar10.

Pre-trained models

We have released checkpoints for the main models in the paper. Before using these models, please review the corresponding model card to understand the intended use and limitations of these models.

Here are the download links for each model checkpoint:

Dependencies

To install all packages in this codebase along with their dependencies, run

```sh
pip install -e .
```

To install with Docker, run the following commands:

```sh
cd docker && make build && make run
```

Model training and sampling

We provide examples of EDM training, consistency distillation, consistency training, single-step generation, and multistep generation in scripts/launch.sh.

Evaluations

To compare different generative models, we use FID, Precision, Recall, and Inception Score. These metrics can all be calculated using batches of samples stored in .npz (numpy) files. One can evaluate samples with cm/evaluations/evaluator.py in the same way as described in openai/guided-diffusion, with reference dataset batches provided therein.
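As a small illustration of the sample format, the sketch below writes a batch of images to an `.npz` file. The `(N, H, W, 3)` uint8 layout under the default `arr_0` key follows the convention used by guided-diffusion's sampling scripts; this is an assumption here, so check `cm/evaluations/evaluator.py` for the exact keys it reads.

```python
import numpy as np

# Hypothetical batch of 4 generated 64x64 RGB samples as uint8 in [0, 255].
samples = np.random.randint(0, 256, size=(4, 64, 64, 3), dtype=np.uint8)

# np.savez stores the first positional array under the key "arr_0".
np.savez("samples.npz", samples)

# The evaluator loads batches from .npz files; verify the round trip here.
loaded = np.load("samples.npz")["arr_0"]
print(loaded.shape, loaded.dtype)  # (4, 64, 64, 3) uint8
```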

Use in 🧨 diffusers

Consistency models are supported in 🧨 diffusers via the ConsistencyModelPipeline class. Below we provide an example:

```python
import torch
from diffusers import ConsistencyModelPipeline

device = "cuda"
# Load the cd_imagenet64_l2 checkpoint.
model_id_or_path = "openai/diffusers-cd_imagenet64_l2"
pipe = ConsistencyModelPipeline.from_pretrained(model_id_or_path, torch_dtype=torch.float16)
pipe.to(device)

# One-step sampling
image = pipe(num_inference_steps=1).images[0]
image.save("consistency_model_onestep_sample.png")

# One-step sampling, class-conditional image generation
# ImageNet-64 class label 145 corresponds to king penguins
class_id = 145
class_id = torch.tensor(class_id, dtype=torch.long)

image = pipe(num_inference_steps=1, class_labels=class_id).images[0]
image.save("consistency_model_onestep_sample_penguin.png")

# Multistep sampling, class-conditional image generation
# Timesteps can be specified explicitly; the timesteps below are taken from the original GitHub repo:
# https://github.com/openai/consistency_models/blob/main/scripts/launch.sh#L77
image = pipe(timesteps=[22, 0], class_labels=class_id).images[0]
image.save("consistency_model_multistep_sample_penguin.png")
```

You can further speed up inference by applying `torch.compile()` to `pipe.unet` (supported in PyTorch 2.0 and later). For more details, please check out the official documentation. This support was contributed to 🧨 diffusers by dg845 and ayushtues.
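As a minimal sketch of the call pattern (not specific to diffusers), `torch.compile` wraps any `nn.Module`; the small stand-in module and the `backend="eager"` argument below are illustrative choices so the sketch runs without a C++ toolchain, not part of the original example.

```python
import torch

# torch.compile (PyTorch >= 2.0) wraps a module and JIT-compiles it on first call.
# For the pipeline above, the analogous call would be:
#   pipe.unet = torch.compile(pipe.unet)
net = torch.nn.Linear(4, 4)

# backend="eager" skips code generation so this demo runs anywhere; omit it to
# use the default inductor backend for real speedups.
compiled = torch.compile(net, backend="eager")

out = compiled(torch.randn(2, 4))
print(tuple(out.shape))  # (2, 4)
```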

Citation

If you find this method and/or code useful, please consider citing

```bibtex
@article{song2023consistency,
  title={Consistency Models},
  author={Song, Yang and Dhariwal, Prafulla and Chen, Mark and Sutskever, Ilya},
  journal={arXiv preprint arXiv:2303.01469},
  year={2023},
}
```