π€ AI Summary
MRI motion artifacts severely degrade diagnostic image quality, yet existing methods struggle to simultaneously suppress artifacts and preserve anatomical fidelity. To address this, we propose a dual-domain implicit neural representation (INR) reconstruction framework that jointly models pixel-space and k-space domains. Our approach introduces a novel low-frequency k-space-guided texture prior modeling mechanism and synergistically integrates high-frequency phase information with pixel-domain features to recover fine anatomical structures. Furthermore, we incorporate complementary binary masks and a dynamic weighted loss to enable globalβlocal adaptive artifact suppression. Evaluated on the NYU fastMRI dataset, our method achieves statistically significant improvements in PSNR and SSIM over state-of-the-art methods, while markedly enhancing image sharpness, anatomical clarity, and perceptual realism. This work establishes a new paradigm for clinical MRI accelerated reconstruction.
π Abstract
Magnetic resonance imaging (MRI) motion artifacts can seriously affect clinical diagnostics, making it challenging to interpret images accurately. Existing methods for eliminating motion artifacts struggle to retain fine structural details and simultaneously lack the necessary vividness and sharpness. In this study, we present a novel dual-domain optimization (DDO) approach that integrates information from the pixel and frequency domains guiding the recovery of clean magnetic resonance images through implicit neural representations(INRs). Specifically, our approach leverages the low-frequency components in the k-space as a reference to capture accurate tissue textures, while high-frequency and pixel information contribute to recover details. Furthermore, we design complementary masks and dynamic loss weighting transitioning from global to local attention that effectively suppress artifacts while retaining useful details for reconstruction. Experimental results on the NYU fastMRI dataset demonstrate that our method outperforms existing approaches in multiple evaluation metrics. Our code is available at https://anonymous.4open.science/r/DDO-IN-A73B.