Focal Split: Untethered Snapshot Depth from Differential Defocus

📅 2025-04-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the limitations of handheld snapshot depth cameras, namely their reliance on external power and computation and their poor reproducibility, we propose Focal Split, a fully onboard, passive, snapshot depth imaging system based on depth from differential defocus (DfDD). Our method employs an achromatic dual-path optical design to capture two differentially defocused images simultaneously, coupled with a lightweight embedded algorithm requiring only 500 FLOPs per pixel, enabling real-time single-frame depth and confidence estimation on a Raspberry Pi 5. The prototype consumes 4.9 W (over 10 hours of battery life), operates within a 0.4–1.2 m range, and outputs sparse 480×360 depth maps at 2.1 FPS. To our knowledge, this is the first end-to-end depth sensing solution that requires no active illumination or external computing hardware and that tightly integrates optics with algorithmic processing. We provide complete open-source hardware schematics, firmware, and assembly documentation, significantly enhancing portability, energy efficiency, and experimental reproducibility.

📝 Abstract
We introduce Focal Split, a handheld, snapshot depth camera with fully onboard power and computing based on depth-from-differential-defocus (DfDD). Focal Split is passive, avoiding the power consumption of light sources. Its achromatic optical system simultaneously forms two differentially defocused images of the scene, which can be independently captured using two photosensors in a snapshot. The data processing is based on the DfDD theory, which efficiently computes a depth and a confidence value for each pixel with only 500 floating point operations (FLOPs) per pixel from the camera measurements. We demonstrate a Focal Split prototype, which comprises a handheld custom camera system connected to a Raspberry Pi 5 for real-time data processing. The system consumes 4.9 W and is powered by a 5 V, 10,000 mAh battery. The prototype can measure objects with distances from 0.4 m to 1.2 m, outputting 480×360 sparse depth maps at 2.1 frames per second (FPS) using unoptimized Python scripts. Focal Split is DIY friendly. A comprehensive guide to building your own Focal Split depth camera, code, and additional data can be found at https://focal-split.qiguo.org.
Problem

Research questions and friction points this paper is trying to address.

Handheld snapshot depth camera using differential defocus
Passive system with onboard power and computing
Real-time depth mapping with low computational cost
Innovation

Methods, ideas, or system contributions that make the work stand out.

Handheld snapshot depth camera with onboard power
Passive achromatic optical system for dual images
Efficient DfDD theory with 500 FLOPs per pixel
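In DfDD theory, the difference between two differentially defocused images is approximately proportional to the Laplacian of the scene image scaled by a depth-dependent factor, so a depth proxy and a confidence value can be recovered per pixel from simple local ratios. The sketch below illustrates this general idea only; the function name, the ratio estimator, and the confidence definition are illustrative assumptions, not the paper's exact 500-FLOP pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def dfdd_depth_proxy(img_a, img_b, eps=1e-6):
    """Per-pixel depth proxy from two differentially defocused images.

    Hypothetical sketch of the DfDD principle: the image difference
    (img_a - img_b) behaves like the Laplacian of the mean image
    scaled by a depth-dependent factor, so a least-squares-style
    ratio of the two recovers that factor where texture exists.
    """
    mean_img = 0.5 * (img_a + img_b)
    diff = img_a - img_b          # differential defocus signal
    lap = laplace(mean_img)       # spatial Laplacian of the mean image
    conf = lap ** 2               # confidence: local texture strength
    depth_proxy = diff * lap / (conf + eps)  # regularized ratio estimate
    return depth_proxy, conf

# Synthetic usage: one texture seen under two slightly different blurs.
rng = np.random.default_rng(0)
texture = gaussian_filter(rng.standard_normal((64, 64)), 1.0)
img_a = gaussian_filter(texture, 1.5)   # first defocus setting
img_b = gaussian_filter(texture, 2.0)   # second defocus setting
depth, conf = dfdd_depth_proxy(img_a, img_b)
```

In practice the confidence map would be thresholded to produce the sparse depth output the abstract describes, keeping only pixels with enough local texture for the ratio to be reliable.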
Junjie Luo
Zhejiang A&F University
Urban digital twin · Visual perception · UAV modelling

John Mamish
College of Computing, Georgia Institute of Technology

Alan Fu
Elmore Family School of Electrical and Computer Engineering, Purdue University

Thomas Concannon
Elmore Family School of Electrical and Computer Engineering, Purdue University

Josiah Hester
Associate Professor, Interactive Computing and Computer Science, Georgia Tech
Ubiquitous Computing · Battery-Free Systems · Wearables · Intermittent Computing

Emma Alexander
Northwestern University
Computer Vision

Qi Guo
Elmore Family School of Electrical and Computer Engineering, Purdue University