Factored Levenberg-Marquardt for Diffeomorphic Image Registration: An efficient optimizer for FireANTs

📅 2026-03-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the high memory consumption of FireANTs in large-scale image registration caused by the Adam optimizer by proposing a lightweight variant of the Levenberg-Marquardt (LM) optimizer. The proposed method maintains only a single scalar damping parameter, adaptively adjusted via a trust-region strategy, and incorporates a Metropolis-Hastings-style rejection mechanism to prevent loss deterioration. Integrating this LM optimizer into the FireANTs framework with GPU-accelerated Eulerian descent substantially reduces memory usage (by up to 24.6%) while matching or exceeding Adam's performance on three out of four benchmark datasets. Notably, hyperparameters optimized on brain MRI data generalize effectively to lung CT and abdominal cross-modality registration tasks without further tuning, demonstrating strong cross-domain transferability.

📝 Abstract
FireANTs introduced a novel Eulerian descent method for plug-and-play behavior with arbitrary optimizers adapted for diffeomorphic image registration as a test-time optimization problem, with a GPU-accelerated implementation. FireANTs uses Adam as its default optimizer for fast and more robust optimization. However, Adam requires storing state variables (i.e., momentum and squared-momentum estimates), each of which can consume significant memory, prohibiting its use on very large images. In this work, we propose a modified Levenberg-Marquardt (LM) optimizer that requires only a single scalar damping parameter as optimizer state, which is adaptively tuned using a trust-region approach. The resulting optimizer reduces memory by up to 24.6% for large volumes while retaining performance across all four datasets. A single hyperparameter configuration tuned on brain MRI transfers without modification to lung CT and cross-modal abdominal registration, matching or outperforming Adam on three of four benchmarks. We also perform ablations on the effectiveness of a Metropolis-Hastings-style rejection step that prevents updates that worsen the loss function.
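The abstract describes the core loop: a damped update whose single scalar damping parameter shrinks when a step improves the loss (widening the trust region) and grows when a proposed step is rejected for worsening it. A minimal sketch of that loop is below; all names (`lm_step`, the `grow`/`shrink` factors) are illustrative assumptions, not the paper's actual implementation, and the toy update uses the gradient alone rather than the full Gauss-Newton curvature.

```python
# Sketch of a scalar-damped trust-region update with hard rejection of
# loss-worsening proposals, as described in the abstract. Names and the
# grow/shrink factors are hypothetical choices for illustration.

def lm_step(params, grad, loss_fn, damping, grow=2.0, shrink=0.5):
    """One damped step. Returns (new_params, new_damping).

    The only persistent optimizer state is the scalar `damping`,
    in contrast to Adam's per-parameter momentum buffers.
    """
    old_loss = loss_fn(params)
    # LM-style damped update: larger damping means a smaller, more
    # conservative step (tighter trust region).
    proposal = [p - g / damping for p, g in zip(params, grad)]
    if loss_fn(proposal) < old_loss:
        # Accept the step and loosen the damping (enlarge trust region).
        return proposal, damping * shrink
    # Reject: keep the old parameters and tighten the damping, so the
    # loss never deteriorates (the rejection mechanism from the abstract).
    return params, damping * grow


# Toy usage on a 1-D quadratic loss f(x) = x^2.
def quad_loss(p):
    return sum(x * x for x in p)

params, damping = [4.0], 1.0
for _ in range(20):
    grad = [2.0 * x for x in params]
    params, damping = lm_step(params, grad, quad_loss, damping)
```

In the toy run, the first proposal overshoots and is rejected (parameters unchanged, damping doubled), after which the tighter step is accepted; this is the adaptive trust-region behavior the paper attributes to the scalar damping parameter.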
Problem

Research questions and friction points this paper is trying to address.

diffeomorphic image registration
memory-efficient optimization
large-scale image registration
optimizer state
GPU-accelerated registration
Innovation

Methods, ideas, or system contributions that make the work stand out.

Levenberg-Marquardt
diffeomorphic image registration
memory-efficient optimization
trust region
FireANTs
Rohit Jena
PhD in CS, University of Pennsylvania
Reinforcement Learning · Post-training · AI for healthcare · Distributed Optimization
Pratik Chaudhari
University of Pennsylvania
Deep Learning · Machine Learning · Robotics
James C. Gee
Department of Radiology & Penn Image Computing and Science Laboratory (PICSL), Perelman School of Medicine, University of Pennsylvania, Philadelphia, PA 19104, USA