Generative Refocusing: Flexible Defocus Control from a Single Image

📅 2025-12-18
🤖 AI Summary
Existing single-image refocusing methods rely on all-in-focus inputs and synthetic data, and offer only limited aperture control. This work proposes the first end-to-end depth-of-field refocusing framework that handles arbitrarily blurred inputs, enabling simultaneous all-in-focus restoration and controllable, photorealistic bokeh synthesis. Methodologically, it introduces a semi-supervised training paradigm that jointly leverages synthetic paired data and unpaired real-world bokeh images; it incorporates EXIF-metadata-driven, physics-aware modeling to explicitly encode lens optical properties; and it integrates three core modules (DeblurNet for defocus deblurring, BokehNet for parametric bokeh generation, and a text-guided diffusion-based control module) to support continuous aperture adjustment and non-circular aperture blur. The approach achieves state-of-the-art performance on three major benchmarks (deblurring, bokeh synthesis, and refocusing), demonstrating clear gains in real-scenario applicability and editing flexibility.
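The two-stage pipeline described above can be sketched in code. This is an illustrative toy only, not the paper's implementation: the stand-in `deblur_net` is an identity placeholder for the learned DeblurNet, and the stand-in `bokeh_net` approximates parametric bokeh with a depth-layered box blur; all function names and the layering scheme are assumptions for illustration.

```python
import numpy as np

def box_blur(img, radius):
    """Edge-padded box blur; a crude stand-in for a learned bokeh kernel."""
    if radius <= 0:
        return img.astype(float)
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge").astype(float)
    out = np.zeros((h, w))
    for dy in range(2 * radius + 1):
        for dx in range(2 * radius + 1):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (2 * radius + 1) ** 2

def deblur_net(image):
    """Placeholder for DeblurNet: the real model is a learned network that
    restores an all-in-focus image from an arbitrarily blurred input."""
    return image.astype(float)

def bokeh_net(aif, depth, focus_depth, aperture, n_layers=8):
    """Placeholder for BokehNet: blur each depth layer in proportion to its
    distance from the focal plane, scaled by an aperture parameter."""
    edges = np.linspace(depth.min(), depth.max(), n_layers + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    labels = np.clip(np.digitize(depth, edges) - 1, 0, n_layers - 1)
    out = np.zeros_like(aif, dtype=float)
    for k, c in enumerate(centers):
        radius = int(round(aperture * abs(c - focus_depth)))
        blurred = box_blur(aif, radius)
        out[labels == k] = blurred[labels == k]
    return out

def refocus(image, depth, focus_depth, aperture):
    aif = deblur_net(image)  # stage 1: all-in-focus restoration
    return bokeh_net(aif, depth, focus_depth, aperture)  # stage 2: bokeh
```

Setting `aperture` to zero reproduces the all-in-focus output, mirroring how the framework exposes continuous aperture control.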

📝 Abstract
Depth-of-field control is essential in photography, but achieving the desired focus often takes several attempts or specialized equipment. Single-image refocusing remains difficult: it requires both recovering sharp content and synthesizing realistic bokeh. Current methods have significant drawbacks: they need all-in-focus inputs, depend on synthetic data from simulators, and offer limited aperture control. We introduce Generative Refocusing, a two-stage pipeline that uses DeblurNet to recover an all-in-focus image from diverse inputs and BokehNet to synthesize controllable bokeh. Our key innovation is a semi-supervised training scheme that combines synthetic paired data with unpaired real bokeh images, using EXIF metadata to capture real optical characteristics beyond what simulators can provide. Experiments show state-of-the-art performance on defocus deblurring, bokeh synthesis, and refocusing benchmarks. In addition, Generative Refocusing supports text-guided adjustments and custom aperture shapes.
Problem

Research questions and friction points this paper is trying to address.

Recovering all-in-focus images from defocused inputs
Generating realistic and controllable bokeh effects
Enabling flexible refocusing from a single image
Innovation

Methods, ideas, or system contributions that make the work stand out.

Two-step process with DeblurNet and BokehNet
Semi-supervised training with real EXIF metadata
Text-guided adjustments and custom aperture shapes
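The "real EXIF metadata" ingredient above has a concrete optical grounding: the standard thin-lens circle-of-confusion formula maps lens parameters to defocus blur size. The sketch below is not from the paper; it only shows how EXIF-style inputs (FocalLength, FNumber, SubjectDistance) relate to physical blur, and the function name and millimetre units are illustrative choices.

```python
def coc_diameter_mm(focal_mm, f_number, focus_dist_mm, subject_dist_mm):
    """Thin-lens circle-of-confusion diameter on the sensor, in mm.

    Inputs correspond to the EXIF tags FocalLength, FNumber, and
    SubjectDistance (all distances here in millimetres).
    """
    aperture_mm = focal_mm / f_number                    # entrance-pupil diameter
    magnification = focal_mm / (focus_dist_mm - focal_mm)  # at the focus plane
    # Blur disc grows with aperture and with the subject's relative
    # distance from the focal plane.
    return (aperture_mm * magnification
            * abs(subject_dist_mm - focus_dist_mm) / subject_dist_mm)
```

For example, a 50 mm lens at f/2 focused at 2 m renders a subject at 4 m with a roughly 0.32 mm blur disc, while stopping down to f/8 shrinks it fourfold, which is the kind of lens-dependent behavior EXIF-aware modeling can encode.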