Depth Jitter: Seeing through the Depth

📅 2025-08-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional data augmentation methods neglect depth awareness, resulting in insufficient robustness of models under realistic depth variations. To address this, we propose Depth-Jitter—the first depth-aware augmentation technique that adaptively modulates perturbations based on local depth variance. Specifically, it computes per-pixel depth variance from depth maps and generates structure-preserving, depth-aligned jitter perturbations to explicitly model depth uncertainty. Depth-Jitter requires no additional annotations and is rigorously validated across diverse encoders, learning rates, and loss functions on FathomNet and UTDAC2020. It significantly improves generalization and stability in depth-sensitive applications—including underwater imaging, robotic navigation, and autonomous driving. While not universally superior to baselines under standard conditions, Depth-Jitter demonstrates marked performance gains in scenarios with complex depth distributions or severe imaging distortions. The implementation is publicly available.
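The mechanism described above, computing per-pixel depth variance and gating a depth-aligned perturbation on it, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, window size, variance threshold, and the simple brightness modulation standing in for a depth-dependent appearance change are all assumptions.

```python
import numpy as np

def local_variance(depth, window=5):
    """Per-pixel depth variance over a sliding window (edge-padded)."""
    pad = window // 2
    padded = np.pad(depth.astype(np.float64), pad, mode="edge")
    H, W = depth.shape
    acc = np.zeros((H, W))
    acc2 = np.zeros((H, W))
    # Accumulate sum and sum of squares over all window offsets.
    for dy in range(window):
        for dx in range(window):
            patch = padded[dy:dy + H, dx:dx + W]
            acc += patch
            acc2 += patch * patch
    n = window * window
    mean = acc / n
    return acc2 / n - mean * mean  # Var[x] = E[x^2] - E[x]^2

def depth_jitter(image, depth, var_threshold=0.01, jitter_scale=0.05, seed=None):
    """Hypothetical depth-aware jitter: perturb depth only where local
    variance is low (flat regions), leaving high-variance structure
    (edges, object boundaries) untouched."""
    rng = np.random.default_rng(seed)
    var = local_variance(depth)
    mask = var < var_threshold                      # structure-preserving gate
    offset = rng.normal(0.0, jitter_scale, size=depth.shape) * mask
    jittered_depth = depth + offset
    # Modulate image brightness by the depth offset as a crude stand-in
    # for a depth-dependent appearance change (assumed, not from the paper).
    factor = 1.0 - offset[..., None] if image.ndim == 3 else 1.0 - offset
    return np.clip(image * factor, 0.0, 1.0), jittered_depth
```

Gating on local variance is what keeps the perturbation structure-preserving: at depth discontinuities the window variance exceeds the threshold, so the jitter is masked out there.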

📝 Abstract
Depth information is essential in computer vision, particularly in underwater imaging, robotics, and autonomous navigation. However, conventional augmentation techniques overlook depth-aware transformations, limiting model robustness under real-world depth variations. In this paper, we introduce Depth-Jitter, a novel depth-based augmentation technique that simulates natural depth variations to improve generalization. Our approach applies adaptive depth offsetting, guided by depth variance thresholds, to generate synthetic depth perturbations while preserving structural integrity. We evaluate Depth-Jitter on two benchmark datasets, FathomNet and UTDAC2020, demonstrating its impact on model stability under diverse depth conditions. Extensive experiments compare Depth-Jitter against traditional augmentation strategies such as ColorJitter, analyzing performance across varying learning rates, encoders, and loss functions. While Depth-Jitter does not always outperform conventional methods in absolute performance, it consistently enhances model stability and generalization in depth-sensitive environments. These findings highlight the potential of depth-aware augmentation for real-world applications and provide a foundation for further research into depth-based learning strategies. The code is publicly available at https://github.com/mim-team/Depth-Jitter.
Problem

Research questions and friction points this paper is trying to address.

Enhancing model robustness in real-world depth variations
Improving generalization with depth-based augmentation techniques
Addressing the lack of depth awareness in conventional augmentation techniques
Innovation

Methods, ideas, or system contributions that make the work stand out.

Depth-Jitter simulates natural depth variations
Adaptive depth offsetting preserves structural integrity
Enhances model stability in depth-sensitive environments