This repository contains the official implementation of the paper: Learning to Integrate Diffusion ODEs by Averaging the Derivatives.

Code: https://github.com/poppuppy/secant_expectation

About

This work introduces an intermediate strategy for accelerating diffusion model inference: learning the ODE integration directly. It trains with loss functions derived from the derivative-integral relationship of the ODE, inspired by Monte Carlo integration and Picard iteration. These objectives, referred to as "secant losses," balance performance against cost and exhibit strong training stability. The secant version of EDM achieves a 10-step FID of 2.14 on CIFAR-10, while the secant version of SiT-XL/2 attains a 4-step FID of 2.27 and an 8-step FID of 1.96 on ImageNet-256x256.
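To make the idea concrete, here is the underlying identity (written in generic probability-flow ODE notation; this is a sketch, not necessarily the paper's exact parameterization). For an ODE $\mathrm{d}x_\tau/\mathrm{d}\tau = v(x_\tau, \tau)$, the exact solution satisfies

$$
x_t = x_s + \int_s^t v(x_\tau, \tau)\,\mathrm{d}\tau = x_s + (t - s)\,\mathbb{E}_{\tau \sim \mathcal{U}[s,t]}\big[v(x_\tau, \tau)\big],
$$

so the secant slope $(x_t - x_s)/(t - s)$ equals the average derivative over $[s, t]$. A network $u_\theta(x_s, s, t)$ trained to match Monte Carlo estimates of this average can then integrate the ODE in a few large steps.

Below is a minimal PyTorch sketch of one such training step. It assumes a frozen teacher `v_phi` that returns the instantaneous derivative and a student `u_theta` that predicts the average derivative; all names, signatures, and the single-sample estimator are illustrative assumptions, not the repository's actual API.

```python
import torch

def secant_loss(u_theta, v_phi, x_s, s, t):
    """Sketch of a one-sample Monte Carlo secant objective (hypothetical API).

    u_theta(x, s, t): student predicting the average derivative over [s, t].
    v_phi(x, tau):    frozen teacher giving the instantaneous derivative.
    s and t are assumed to broadcast against x_s (e.g. shape [B, 1, 1, 1]).
    """
    tau = s + torch.rand_like(s) * (t - s)        # tau ~ Uniform[s, t]
    with torch.no_grad():
        # Picard-style bootstrap: reach x_tau with the student's own secant,
        # then sample the teacher's derivative there.
        x_tau = x_s + (tau - s) * u_theta(x_s, s, tau)
        target = v_phi(x_tau, tau)
    pred = u_theta(x_s, s, t)                     # predicted average derivative
    return torch.mean((pred - target) ** 2)
```

At inference time, a few-step sampler would then apply $x_t = x_s + (t - s)\,u_\theta(x_s, s, t)$ at each step, replacing many small Euler steps with a handful of learned secant steps.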

Installation

For installation, training, and evaluation instructions, please refer to the GitHub repository.

Checkpoints

Trained checkpoints for different settings (SDEI, SDEE, STEE, and various step counts) are provided on Hugging Face.

Citation

If you find this work useful in your research, please consider citing:

@article{liu2025learning,
  title={Learning to Integrate Diffusion ODEs by Averaging the Derivatives},
  author={Liu, Wenze and Yue, Xiangyu},
  journal={arXiv preprint arXiv:2505.14502},
  year={2025}
}