LightCtrl: Training-free Controllable Video Relighting

📅 2026-03-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing video relighting methods struggle to enable explicit control over the output illumination. This work proposes a training-free, controllable video relighting approach that dynamically modulates lighting changes according to user-specified light trajectories. By integrating a pretrained image relighting model with video diffusion priors, the method introduces a light map injection module and a geometry-aware relighting module. It further employs normal map fusion in the frequency domain and a latent-space noise sampling strategy to simultaneously ensure temporal consistency and significantly enhance adherence to the prescribed illumination trajectory. Experiments demonstrate that the proposed method generates high-quality videos with diverse and precisely trajectory-aligned lighting, achieving markedly superior controllability compared to existing baselines.
📝 Abstract
Recent diffusion models have achieved remarkable success in image relighting, and this success has quickly been extended to video relighting. However, existing methods offer limited explicit control over illumination in the relighted output. We present LightCtrl, the first controllable video relighting method that enables explicit control of video illumination through a user-supplied light trajectory in a training-free manner. Our approach combines pre-trained diffusion models: an image relighting model processes each frame individually, followed by a video diffusion prior that enhances temporal consistency. To achieve explicit control over dynamically varying lighting, we introduce two key components. First, a Light Map Injection module samples light trajectory-specific noise and injects it into the latent representation of the source video, improving illumination coherence with the conditional light trajectory. Second, a Geometry-Aware Relighting module dynamically combines RGB and normal map latents in the frequency domain to suppress the influence of the original lighting, further enhancing adherence to the input light trajectory. Experiments show that LightCtrl produces high-quality videos with diverse illumination changes that closely follow the specified light trajectory, demonstrating improved controllability over baseline methods. Code is available at: https://github.com/GVCLab/LightCtrl.
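The two modules described above can be sketched in a few lines of NumPy. The sketch below is illustrative only: the blend rule for the noise injection, the radial low-pass band split, and the weighting `alpha` are all assumptions, not the paper's exact formulation.

```python
import numpy as np

def inject_light_noise(source_latent, light_map_latent, strength=0.3, seed=0):
    """Hypothetical Light Map Injection: bias Gaussian noise toward the
    encoded light-map latent before adding it to the source latent, so the
    starting point of denoising already leans toward the target lighting."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(source_latent.shape)
    biased = np.sqrt(1.0 - strength) * eps + np.sqrt(strength) * light_map_latent
    return source_latent + biased

def fuse_latents_frequency(rgb_latent, normal_latent, alpha=0.5, cutoff=0.25):
    """Hypothetical geometry-aware fusion in the frequency domain: blend the
    low-frequency band (which carries most of the original shading) toward
    the normal-map latent to suppress source lighting, while keeping RGB
    high-frequency detail. Band split and weighting are assumptions."""
    rgb_f = np.fft.fftshift(np.fft.fft2(rgb_latent, axes=(-2, -1)), axes=(-2, -1))
    nrm_f = np.fft.fftshift(np.fft.fft2(normal_latent, axes=(-2, -1)), axes=(-2, -1))

    h, w = rgb_latent.shape[-2:]
    yy, xx = np.mgrid[:h, :w]
    dist = np.hypot(yy - h / 2, xx - w / 2)
    low_pass = dist <= cutoff * min(h, w)  # boolean low-frequency mask

    fused_f = np.where(low_pass,
                       alpha * nrm_f + (1.0 - alpha) * rgb_f,  # blend low band
                       rgb_f)                                   # keep RGB highs
    fused = np.fft.ifft2(np.fft.ifftshift(fused_f, axes=(-2, -1)), axes=(-2, -1))
    return fused.real
```

At `alpha=0` the fusion reduces to the original RGB latent, and at `alpha=1` with a large `cutoff` it returns the normal-map latent, so the single scalar trades off lighting suppression against appearance fidelity.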
Problem

Research questions and friction points this paper is trying to address.

- video relighting
- illumination control
- light trajectory
- controllable lighting
- temporal consistency
Innovation

Methods, ideas, or system contributions that make the work stand out.

- controllable video relighting
- training-free
- light trajectory
- diffusion models
- geometry-aware relighting