ATOM-CBF: Adaptive Safe Perception-Based Control under Out-of-Distribution Measurements

📅 2025-11-11
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Learning-based perception modules pose safety risks under out-of-distribution (OoD) inputs due to epistemic uncertainty. Existing safety frameworks typically require ground-truth labels or prior knowledge of distribution shifts, both of which are often unavailable in real-world deployment. Method: We propose a label-free, distribution-agnostic adaptive safety control framework that explicitly models OoD perception uncertainty and embeds it into Control Barrier Functions (CBFs). Our approach jointly optimizes data-driven perception models with a safety layer, integrating online adaptive error-bound estimation and a safety-filtering mechanism to enable tunable conservatism and real-time guarantees. Contribution/Results: To our knowledge, this is the first work to incorporate explicit OoD uncertainty modeling directly into CBF synthesis. Evaluated on F1Tenth LiDAR navigation and quadrupedal robot RGB-vision control tasks, our method significantly improves safety and robustness under OoD conditions. It establishes a verifiable, deployable safety paradigm for autonomous systems relying on learned perception.

πŸ“ Abstract
Ensuring the safety of real-world systems is challenging, especially when they rely on learned perception modules to infer the system state from high-dimensional sensor data. These perception modules are vulnerable to epistemic uncertainty, often failing when encountering out-of-distribution (OoD) measurements not seen during training. To address this gap, we introduce ATOM-CBF (Adaptive-To-OoD-Measurement Control Barrier Function), a novel safe control framework that explicitly computes and adapts to the epistemic uncertainty from OoD measurements, without the need for ground-truth labels or information on distribution shifts. Our approach features two key components: (1) an OoD-aware adaptive perception error margin and (2) a safety filter that integrates this adaptive error margin, enabling the filter to adjust its conservatism in real-time. We provide empirical validation in simulations, demonstrating that ATOM-CBF maintains safety for an F1Tenth vehicle with LiDAR scans and a quadruped robot with RGB images.
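The safety filter described above can be illustrated with a minimal sketch. The paper's actual formulation is not reproduced here; this shows the general pattern of a robust CBF constraint for a control-affine system with a single input, where an `error_margin` term inflates the barrier condition Lf h + Lg h·u + α·h ≥ margin to absorb perception error, and the QP projection admits a closed-form solution in the scalar case. All names and the scalar simplification are illustrative assumptions.

```python
def atom_cbf_filter(u_nom, h, Lf_h, Lg_h, alpha, margin):
    """Illustrative robust CBF safety filter (scalar control).

    Enforces Lf_h + Lg_h * u + alpha * h >= margin, where `margin`
    is an inflation term meant to absorb perception error. Returns the
    minimal modification of the nominal control u_nom (closed-form
    solution of the one-dimensional CBF-QP). Hypothetical sketch,
    not the paper's implementation.
    """
    c = margin - Lf_h - alpha * h       # constraint: Lg_h * u >= c
    if abs(Lg_h) < 1e-9:
        return u_nom                    # control does not affect h here
    bound = c / Lg_h
    if Lg_h > 0:
        return max(u_nom, bound)        # clip from below
    return min(u_nom, bound)            # clip from above
```

A larger `margin` (e.g., under strongly OoD measurements) shrinks the admissible control set, making the filter more conservative; a small margin in-distribution leaves the nominal controller mostly untouched.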
Problem

Research questions and friction points this paper is trying to address.

Addresses safety challenges in systems using learned perception modules
Mitigates epistemic uncertainty from out-of-distribution sensor measurements
Provides adaptive safety control without requiring ground-truth labels
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adaptive safe control for out-of-distribution measurements
OoD-aware perception error margin for real-time adaptation
Safety filter integrating adaptive error margin
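The OoD-aware adaptive error margin above can be sketched as follows. The paper's label-free estimator is not specified in this summary, so this stand-in maps a simple OoD score (distance of the current perception feature to the training set) onto a margin value; the class name, distance metric, and parameters are all illustrative assumptions, not the authors' method.

```python
import numpy as np

class AdaptiveErrorMargin:
    """Illustrative stand-in for an OoD-aware adaptive error margin.

    Uses nearest-neighbor distance to training features as a label-free
    OoD score and maps it linearly onto a margin for the safety filter.
    Hypothetical sketch; the paper's estimator may differ.
    """

    def __init__(self, train_features, base_margin=0.05, gain=0.5):
        self.train = np.asarray(train_features, dtype=float)
        self.base = base_margin   # in-distribution floor
        self.gain = gain          # conservatism per unit of OoD score

    def __call__(self, feature):
        # OoD score: distance to the closest training feature
        d = np.linalg.norm(self.train - np.asarray(feature, dtype=float),
                           axis=1).min()
        return self.base + self.gain * d
```

In-distribution inputs (near the training features) yield the base margin, while far-from-training inputs inflate it, which is the real-time conservatism adjustment the bullet points describe.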