Verification of Visual Controllers via Compositional Geometric Transformations

📅 2025-07-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing formal verification methods for vision-based controllers predominantly rely on pixel-wise $L_p$-norm perturbations, failing to capture low-dimensional structured uncertainties arising from geometric transformations—such as rotation, scaling, and translation—in realistic scenarios. This work proposes the first formal verification framework tailored to geometric transformations: it establishes an explicit, differentiable rendering map from system states to observed images; models geometric uncertainty via transformation groups; and abstracts the state space to enable precise characterization of perception uncertainty directly in image space. Leveraging forward reachability analysis, the framework computes over-approximations of closed-loop system behaviors, yielding theoretically sound safety guarantees. Experiments on multiple control benchmarks demonstrate substantial improvements in coverage and verification accuracy against realistic visual disturbances, overcoming fundamental limitations of conventional pixel-level analyses.
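The summary's "explicit rendering map" combined with "transformation groups" (rotation, scaling, translation) can be pictured as a single composed affine warp applied to an observed image. The sketch below is purely illustrative and is not the authors' implementation; `affine_warp` is a hypothetical helper using inverse-mapped nearest-neighbor sampling:

```python
import numpy as np

def affine_warp(image, theta, scale, tx, ty):
    """Apply a composed rotation/scale/translation to a grayscale image
    via inverse mapping with nearest-neighbor sampling.
    Illustrative stand-in for a rendering-map perturbation; not the
    paper's actual rendering pipeline."""
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    c, s = np.cos(theta), np.sin(theta)
    # Forward transform: rotate and scale about the image center, then translate.
    A = scale * np.array([[c, -s], [s, c]])
    A_inv = np.linalg.inv(A)
    ys, xs = np.mgrid[0:h, 0:w]
    # Output-pixel coordinates, centered and with translation undone.
    coords = np.stack([xs - cx - tx, ys - cy - ty])          # shape (2, h, w)
    src = np.tensordot(A_inv, coords, axes=1)                # inverse-map to source
    sx = np.round(src[0] + cx).astype(int)
    sy = np.round(src[1] + cy).astype(int)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out = np.zeros_like(image)
    out[valid] = image[sy[valid], sx[valid]]
    return out
```

Composing the three transformations into one matrix is what makes the perturbation set low-dimensional: three or four scalar parameters, rather than one degree of freedom per pixel as in $L_p$-ball models.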

📝 Abstract
Perception-based neural network controllers are increasingly used in autonomous systems that rely on visual inputs to operate in the real world. Ensuring the safety of such systems under uncertainty is challenging. Existing verification techniques typically focus on $L_p$-bounded perturbations in the pixel space, which fail to capture the low-dimensional structure of many real-world effects. In this work, we introduce a novel verification framework for perception-based controllers that can generate outer-approximations of reachable sets through explicitly modeling uncertain observations with geometric perturbations. Our approach constructs a boundable mapping from states to images, enabling the use of state-based verification tools while accounting for uncertainty in perception. We provide theoretical guarantees on the soundness of our method and demonstrate its effectiveness across benchmark control environments. This work provides a principled framework for certifying the safety of perception-driven control systems under realistic visual perturbations.
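One way to picture a "boundable mapping from states to images" is elementwise pixel bounds over a box of states: evaluate the rendering map on a state grid and pad by a Lipschitz slack. This is a hedged sketch, not the paper's construction; the paper derives sound bounds analytically, and grid sampling alone is sound only if the supplied `lip` constant is valid:

```python
import numpy as np

def image_bounds(render, state_lo, state_hi, samples=5, lip=0.0):
    """Elementwise lower/upper pixel bounds on render(s) over the state
    box [state_lo, state_hi], via grid sampling plus Lipschitz padding.
    Illustrative only: soundness requires `lip` to be a true Lipschitz
    constant of the rendering map."""
    axes = [np.linspace(lo, hi, samples) for lo, hi in zip(state_lo, state_hi)]
    grid = np.stack(np.meshgrid(*axes, indexing="ij"), axis=-1)
    grid = grid.reshape(-1, len(state_lo))
    imgs = np.stack([render(s) for s in grid])
    # Half the largest grid spacing bounds each state's distance to a sample.
    pad = lip * 0.5 * max((hi - lo) / (samples - 1)
                          for lo, hi in zip(state_lo, state_hi))
    return imgs.min(axis=0) - pad, imgs.max(axis=0) + pad
```

Given such bounds, any image inside them is a possible observation, so a state-space verifier can propagate the induced set of controller outputs instead of reasoning about individual pixels.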
Problem

Research questions and friction points this paper is trying to address.

Verify safety of neural visual controllers under uncertainty
Address real-world geometric perturbations in perception
Provide sound reachability analysis for perception-driven systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Compositional geometric transformations for verification
Boundable state-to-image mapping with uncertainty
Outer-approximations of reachable sets
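The "outer-approximations of reachable sets" idea can be sketched with one forward-reachability step for a linear closed loop $x^+ = Ax + Bu$, where the control interval $[u_{lo}, u_{hi}]$ comes from perception uncertainty. This is a toy interval-arithmetic stand-in, not the paper's reachability method:

```python
import numpy as np

def interval_step(A, B, x_lo, x_hi, u_lo, u_hi):
    """One forward-reachability step for x+ = A x + B u with interval
    state and control sets, using the standard interval matrix-vector
    product. Toy stand-in for the paper's reachability analysis."""
    def mat_interval(M, lo, hi):
        # Split M by sign so each entry picks the bound that extremizes it.
        Mp, Mn = np.maximum(M, 0.0), np.minimum(M, 0.0)
        return Mp @ lo + Mn @ hi, Mp @ hi + Mn @ lo
    ax_lo, ax_hi = mat_interval(A, x_lo, x_hi)
    bu_lo, bu_hi = mat_interval(B, u_lo, u_hi)
    return ax_lo + bu_lo, ax_hi + bu_hi
```

Iterating such steps yields a box at each horizon step that provably contains every true trajectory, which is what makes the resulting safety certificate sound (if conservative).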
Alexander Estornell
Northeastern University, Boston, MA, USA
Leonard Jung
Northeastern University, Boston, MA, USA
Michael Everett
Assistant Professor, Northeastern University
Robotics · Learning · Control Theory · Safety