🤖 AI Summary
Deformable medical image registration is critical for precision radiotherapy and tumor follow-up, yet existing AI-based methods often yield anatomically implausible deformations and generalize poorly. This paper proposes a keypoint-driven implicit registration framework that formulates displacement field estimation as a signal reconstruction problem: a dense deformation is recovered from sparse keypoint correspondences via a learnable implicit kernel function. The method is integrated into a coarse-to-fine hierarchical architecture and supports interactive deformation refinement at test time. By combining implicit neural representations with learnable kernel regression, the approach bridges the generalization gap between implicit and explicit registration paradigms. Evaluated on zero-shot longitudinal thoracoabdominal registration, where no patient-specific training data is available, the method achieves state-of-the-art accuracy, markedly improves the anatomical consistency of the predicted deformations, matches the performance of commercial systems, and is validated on multi-center clinical datasets.
📝 Abstract
Deformable medical image registration is an essential task in computer-assisted interventions. This problem is particularly relevant to oncological treatments, where precise image alignment is necessary for tracking tumor growth, assessing treatment response, and ensuring accurate delivery of therapies. Recent AI methods can outperform traditional techniques in accuracy and speed, yet they often produce unreliable deformations that limit their clinical adoption. In this work, we address this challenge and introduce a novel implicit registration framework that predicts accurate and reliable deformations. Our insight is to reformulate image registration as a signal reconstruction problem: we learn a kernel function that recovers the dense displacement field from sparse keypoint correspondences. We integrate our method into a novel hierarchical architecture and estimate the displacement field in a coarse-to-fine manner. Our formulation also allows for efficient refinement at test time, permitting clinicians to easily adjust registrations when needed. We validate our method on challenging intra-patient thoracic and abdominal zero-shot registration tasks, using public datasets and internal datasets from the local University Hospital. Our method not only achieves accuracy competitive with state-of-the-art approaches, but also bridges the generalization gap between implicit and explicit registration techniques. In particular, it generates deformations that better preserve anatomical relationships and matches the performance of specialized commercial systems, underscoring its potential for clinical adoption.
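The abstract does not specify the form of the learnable kernel, but the core idea, reconstructing a dense displacement field from sparse keypoint correspondences by kernel regression, can be illustrated with a minimal NumPy sketch. Here a fixed Gaussian kernel (Nadaraya-Watson regression) stands in for the paper's learned implicit kernel; the function name, bandwidth `sigma`, and toy keypoints are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kernel_reconstruct(query_pts, keypoints, displacements, sigma=10.0):
    """Estimate a dense displacement field from sparse keypoint
    correspondences via Gaussian kernel regression (Nadaraya-Watson).

    NOTE: a hand-set Gaussian kernel stands in for the paper's
    *learned* implicit kernel; this is an illustrative sketch only.

    query_pts:     (M, D) coordinates where the field is evaluated
    keypoints:     (N, D) sparse keypoint locations in the fixed image
    displacements: (N, D) displacement vectors observed at the keypoints
    sigma:         kernel bandwidth (fixed here, learnable in the paper)
    """
    # Pairwise squared distances between query points and keypoints: (M, N)
    d2 = ((query_pts[:, None, :] - keypoints[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))   # Gaussian kernel weights
    w /= w.sum(axis=1, keepdims=True)      # normalize weights per query point
    return w @ displacements               # (M, D) interpolated displacement

# Toy 2-D example: two keypoints with known displacements; a query point
# midway between them receives the average of the two displacements.
kps = np.array([[0.0, 0.0], [20.0, 0.0]])
disp = np.array([[1.0, 0.0], [0.0, 1.0]])
field = kernel_reconstruct(np.array([[10.0, 0.0]]), kps, disp)
# field → [[0.5, 0.5]]
```

In a learned variant, the kernel shape (and hence the smoothness of the recovered field) would be optimized from data rather than fixed, which is what allows the deformation to remain anatomically plausible away from the keypoints.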