Test-time Uncertainty Estimation for Medical Image Registration via Transformation Equivariance

📅 2025-09-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
In medical image registration, existing deep learning models lack reliable test-time uncertainty estimation, and mainstream approaches require architectural modifications or retraining—rendering them incompatible with already-deployed pretrained models. To address this, we propose a plug-and-play, architecture- and training-free framework for test-time uncertainty estimation. Leveraging the equivariance of registration transformations, our method induces predictive variance via spatial input perturbations and, for the first time, decomposes perturbation-induced uncertainty into intrinsic discretization uncertainty and bias jitter—yielding pixel-wise uncertainty maps highly correlated with registration error. The framework is model-agnostic and compatible with any pretrained registration network. We validate its effectiveness across diverse anatomical regions (brain, heart, abdomen, lung) and multiple state-of-the-art models. Results demonstrate substantial improvements in risk awareness and deployment safety for clinical applications and large-scale studies.

📝 Abstract
Accurate image registration is essential for downstream applications, yet current deep registration networks provide limited indications of whether and when their predictions are reliable. Existing uncertainty estimation strategies, such as Bayesian methods, ensembles, or MC dropout, require architectural changes or retraining, limiting their applicability to pretrained registration networks. Instead, we propose a test-time uncertainty estimation framework that is compatible with any pretrained registration network. Our framework is grounded in the transformation equivariance property of registration, which states that the true mapping between two images should remain consistent under spatial perturbations of the input. By analyzing the variance of network predictions under such perturbations, we derive a theoretical decomposition of perturbation-based uncertainty in registration. This decomposition separates into two terms: (i) an intrinsic spread, reflecting epistemic noise, and (ii) a bias jitter, capturing how systematic error drifts under perturbations. Across four anatomical structures (brain, cardiac, abdominal, and lung) and multiple registration models (uniGradICON, SynthMorph), the uncertainty maps correlate consistently with registration errors and highlight regions requiring caution. Our framework turns any pretrained registration network into a risk-aware tool at test time, placing medical image registration one step closer to safe deployment in clinical and large-scale research settings.
Problem

Research questions and friction points this paper is trying to address.

Estimating uncertainty in medical image registration without retraining models
Using transformation equivariance to evaluate prediction reliability at test time
Separating epistemic noise from systematic error in registration predictions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages transformation equivariance for uncertainty estimation
Analyzes prediction variance under spatial input perturbations
Decomposes uncertainty into intrinsic spread and bias jitter
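The decomposition named above can be illustrated with the law of total variance: if predictions under each perturbation consist of a perturbation-dependent systematic error plus random noise, the total variance splits exactly into the mean within-perturbation variance (intrinsic spread) and the variance of the per-perturbation means (bias jitter). This is a simplified simulation under assumed noise and bias models, not the paper's derivation; in the actual method only one prediction per perturbation is available and the split is obtained theoretically.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated predictions: n_perturb perturbations x n_repeat noisy evaluations.
n_perturb, n_repeat = 50, 200
truth = 2.0
bias = 0.3 * np.sin(np.linspace(0, 2 * np.pi, n_perturb))  # drifts with the perturbation
noise_std = 0.1

preds = truth + bias[:, None] + noise_std * rng.standard_normal((n_perturb, n_repeat))

# Law-of-total-variance decomposition:
intrinsic = preds.var(axis=1).mean()   # E_t[ Var(pred | t) ]  -> "intrinsic spread"
jitter = preds.mean(axis=1).var()      # Var_t( E[pred | t] )  -> "bias jitter"
total = preds.reshape(-1).var()

# For equal-sized groups the identity intrinsic + jitter == total holds exactly
# (up to floating point).
print(intrinsic, jitter, total)
```

In this simulation the jitter term dominates whenever the systematic error swings strongly with the perturbation, which is exactly the regime the paper flags as requiring caution.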
Lin Tian
Massachusetts General Hospital and Harvard Medical School
Xiaoling Hu
Massachusetts General Hospital and Harvard Medical School
Juan Eugenio Iglesias
Massachusetts General Hospital & Harvard Medical School / MIT / UCL
Medical Image Analysis