ProtoDepth: Unsupervised Continual Depth Completion with Prototypes

📅 2025-03-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses catastrophic forgetting in unsupervised continual depth completion. To handle non-stationary data distributions and unknown test-time domain identities, it proposes a prototype-driven continual learning framework that freezes a pretrained backbone and adapts its latent features with per-domain prototype sets, selected at inference via learnable domain descriptors. Because the original backbone weights are never modified, the model retains previously learned domains. The framework combines prototype learning, domain descriptor modeling, an unsupervised depth completion loss, and cross-domain feature alignment. Evaluated on standard continual benchmark sequences, the approach reduces forgetting by 52.2% (indoor) and 53.2% (outdoor) relative to prior methods, achieving state-of-the-art performance.

📝 Abstract
We present ProtoDepth, a novel prototype-based approach for continual learning of unsupervised depth completion, the multimodal 3D reconstruction task of predicting dense depth maps from RGB images and sparse point clouds. The unsupervised learning paradigm is well-suited for continual learning, as ground truth is not needed. However, when training on new non-stationary distributions, depth completion models will catastrophically forget previously learned information. We address forgetting by learning prototype sets that adapt the latent features of a frozen pretrained model to new domains. Since the original weights are not modified, ProtoDepth does not forget when test-time domain identity is known. To extend ProtoDepth to the challenging setting where the test-time domain identity is withheld, we propose to learn domain descriptors that enable the model to select the appropriate prototype set for inference. We evaluate ProtoDepth on benchmark dataset sequences, where we reduce forgetting compared to baselines by 52.2% for indoor and 53.2% for outdoor to achieve the state of the art.
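The central idea in the abstract, adapting the latent features of a frozen pretrained model with a small learnable prototype set, can be sketched roughly as follows. The additive per-channel form and the function name are illustrative assumptions, not the paper's exact design:

```python
import numpy as np

def adapt_features(feats, prototypes):
    """Shift frozen-backbone features toward one domain's prototype set.

    feats: (C, H, W) latent feature map from the frozen pretrained model.
    prototypes: (K, C) learnable vectors for this domain; only these are
    trained, so the backbone itself is never updated and cannot forget.
    (The additive form here is an assumption for illustration.)
    """
    offset = prototypes.mean(axis=0)      # (C,) per-channel adaptation
    return feats + offset[:, None, None]  # broadcast over H and W
```

During continual training, a fresh prototype set would be instantiated per domain while `adapt_features` leaves the backbone parameters untouched.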
Problem

Research questions and friction points this paper is trying to address.

Unsupervised continual depth completion from RGB and sparse point clouds.
Preventing catastrophic forgetting in non-stationary data distributions.
Adapting to new domains without modifying original model weights.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prototype-based continual depth completion learning.
Adapts latent features without modifying original weights.
Domain descriptors for prototype set selection.
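When the test-time domain identity is withheld, the learned domain descriptors let the model pick a prototype set on its own. A minimal sketch of such a selection step, assuming a nearest-descriptor rule under cosine similarity (a plausible reading, not the paper's exact formulation):

```python
import numpy as np

def select_prototype_set(test_embedding, domain_descriptors):
    """Return the domain whose learned descriptor best matches the test sample.

    test_embedding: (D,) embedding of the incoming test sample.
    domain_descriptors: dict mapping domain name -> (D,) learned descriptor.
    Cosine similarity is an illustrative choice of matching score.
    """
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

    # Pick the domain with the highest similarity; its prototype set is
    # then used to adapt the frozen backbone's features for inference.
    return max(domain_descriptors,
               key=lambda d: cos(test_embedding, domain_descriptors[d]))
```

For example, a sample embedded near the "indoor" descriptor would route inference through the indoor prototype set, with no weight updates involved.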