🤖 AI Summary
Traditional Fourier ptychography (FP) relies on illumination-angle scanning or camera displacement to introduce measurement diversity, resulting in system complexity and high cost. This paper proposes inverse synthetic aperture Fourier ptychography (ISAP-FP), the first framework to replace hardware scanning with target self-motion—enabling frequency-domain diversity acquisition under a fixed-illumination, fixed-camera configuration. Its core innovations are: (i) a dual-plane intensity acquisition protocol, and (ii) a deep learning-based k-space coordinate estimation algorithm that requires no prior motion information, directly recovering frequency-domain sampling positions from low-resolution far-field image sequences while jointly performing phase retrieval. Both simulations and experiments demonstrate that ISAP-FP achieves high-resolution, wide-field-of-view imaging while significantly reducing hardware complexity and cost—thereby breaking the conventional synthetic aperture paradigm's dependence on mechanical actuation.
📝 Abstract
Fourier ptychography (FP) is a powerful light-based synthetic aperture imaging technique that allows one to reconstruct a high-resolution, wide field-of-view image by computationally integrating a diverse collection of low-resolution, far-field measurements. Typically, FP measurement diversity is introduced by changing the angle of the illumination or the position of the camera; either approach results in sampling different portions of the target's spatial frequency content, but both approaches add substantial cost and complexity to the acquisition process. In this work, we introduce Inverse Synthetic Aperture Fourier Ptychography, a novel approach to FP that foregoes changing the illumination angle or camera position and instead generates measurement diversity through target motion. Critically, we also introduce a novel learning-based method for estimating k-space coordinates from dual-plane intensity measurements, thereby enabling synthetic aperture imaging without knowing the rotation of the target. We validate our method in simulation and experimentally on a tabletop optical system.
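To make the measurement-diversity idea concrete, here is a minimal toy sketch (not the authors' code) of the standard FP forward model: each low-resolution intensity image corresponds to cropping one sub-region of the target's k-space spectrum (the limited aperture) and inverse-transforming it. In conventional FP the aperture center is moved by changing the illumination angle or camera position; in ISAP-FP, target motion plays that role. The grid sizes, aperture positions, and random target below are illustrative assumptions.

```python
import numpy as np

# Toy Fourier ptychography forward model: a low-resolution intensity image
# is formed by cropping a sub-region of the target's k-space spectrum
# (the finite pupil aperture) and taking the inverse FFT. Sampling different
# k-space centers yields the measurement diversity FP needs.

rng = np.random.default_rng(0)
N, n = 128, 32  # high-resolution grid size; low-resolution (aperture) size

# Hypothetical complex target (amplitude and phase) and its centered spectrum
target = rng.random((N, N)) * np.exp(1j * 2 * np.pi * rng.random((N, N)))
spectrum = np.fft.fftshift(np.fft.fft2(target))

def low_res_measurement(kx, ky):
    """Intensity image when the aperture is centered at k-space offset (kx, ky)."""
    cx, cy = N // 2 + kx, N // 2 + ky
    crop = spectrum[cx - n // 2:cx + n // 2, cy - n // 2:cy + n // 2]
    field = np.fft.ifft2(np.fft.ifftshift(crop))
    return np.abs(field) ** 2  # the camera records intensity only (phase is lost)

# Diversity: different k-space centers (here a small illustrative scan) produce
# different low-resolution images that jointly cover a larger synthetic aperture.
images = [low_res_measurement(kx, ky) for kx in (-16, 0, 16) for ky in (-16, 0, 16)]
print(len(images), images[0].shape)
```

Phase retrieval in FP then stitches these intensity-only crops back into one high-resolution complex spectrum; the paper's contribution is estimating *where* in k-space each crop landed when the sampling trajectory, driven by unknown target motion, is not known in advance.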