Temporal Score Rescaling for Temperature Sampling in Diffusion and Flow Models

📅 2025-10-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of flexibly controlling sampling diversity in pretrained denoising diffusion and flow-matching models—without fine-tuning or altering training procedures. We propose Temporal Score Rescaling (TSR), a plug-and-play mechanism that dynamically rescales the score function during sampling, effectively modulating local temperature to continuously adjust the sharpness (i.e., diversity) of the generated distribution. TSR is compatible with both deterministic and stochastic samplers and requires no architectural or training modifications. We validate TSR across five diverse tasks—image generation, pose estimation, depth prediction, robotic manipulation, and protein design—demonstrating consistent performance gains solely by tuning the rescaling parameter. Results show significant improvements in downstream task metrics, underscoring TSR’s strong generality, practicality, and ease of deployment across arbitrary pretrained diffusion or flow-matching models.


📝 Abstract
We present a mechanism to steer the sampling diversity of denoising diffusion and flow-matching models, allowing users to sample from a sharper or broader distribution than the training distribution. We build on the observation that these models leverage (learned) score functions of noisy data distributions for sampling, and show that rescaling these scores effectively controls a 'local' sampling temperature. Notably, this approach requires no finetuning or alterations to the training strategy; it can be applied to any off-the-shelf model and is compatible with both deterministic and stochastic samplers. We first validate our framework on toy 2D data, and then demonstrate its application to diffusion models trained across five disparate tasks -- image generation, pose estimation, depth prediction, robot manipulation, and protein design. We find that across these tasks, our approach allows sampling from sharper (or flatter) distributions, yielding performance gains: e.g., depth prediction models benefit from sampling more likely depth estimates, whereas image generation models perform better when sampling from a slightly flatter distribution. Project page: https://temporalscorerescaling.github.io
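To build intuition for score rescaling as temperature control, the sketch below is a minimal toy illustration (not the paper's implementation): it integrates a probability-flow ODE for 1-D Gaussian data under a variance-exploding forward process, where the score is known in closed form. Multiplying the score by a hypothetical rescaling factor `k` during sampling sharpens the output distribution for `k > 1` and flattens it for `k < 1`, without touching the "trained" score function itself. The noise schedule and parameter names here are illustrative assumptions.

```python
import numpy as np

def sample_pf_ode(k=1.0, n_samples=20000, n_steps=400,
                  s0=1.0, sigma_max=10.0, seed=0):
    """Euler integration of the probability-flow ODE for toy 1-D data
    x0 ~ N(0, s0^2) under a variance-exploding forward process
    x_t = x0 + sigma_t * z, so p_t = N(0, s0^2 + sigma_t^2) with an
    analytic score. The score is multiplied by k at sampling time,
    mimicking temperature-style rescaling (k > 1 sharpens, k < 1 flattens)."""
    rng = np.random.default_rng(seed)
    sigmas = np.linspace(sigma_max, 0.0, n_steps + 1)  # noise level, high -> low
    # Start from the (approximate) terminal noise distribution p_T.
    x = rng.normal(0.0, np.sqrt(s0**2 + sigma_max**2), n_samples)
    for i in range(n_steps):
        s_cur, s_next = sigmas[i], sigmas[i + 1]
        score = -x / (s0**2 + s_cur**2)       # exact score of p_t at level s_cur
        d_sigma2 = s_next**2 - s_cur**2       # negative: sigma^2 shrinks
        # PF-ODE step dx = -0.5 * d(sigma^2) * score, with rescaled score.
        x = x - 0.5 * d_sigma2 * (k * score)
    return x
```

With `k = 1` the samples recover the data standard deviation (≈ `s0`); larger `k` concentrates samples near the mode, smaller `k` spreads them out, matching the sharper/flatter behavior the abstract describes.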
Problem

Research questions and friction points this paper is trying to address.

Steering sampling diversity in diffusion models
Controlling local temperature without retraining
Applying rescaling across five disparate tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Rescaling score functions to control sampling temperature
No finetuning required for off-the-shelf models
Applicable to both deterministic and stochastic samplers