CogGen: Cognitive-Load-Informed Fully Unsupervised Deep Generative Modeling for Compressively Sampled MRI Reconstruction

📅 2026-02-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of ill-posed inverse problems in compressed sensing MRI (CS-MRI) reconstruction under limited training data or computational resources, where conventional unsupervised generative models often suffer from slow convergence and overfitting to noise. To mitigate this, the authors propose a cognitive-load-guided self-paced curriculum learning framework that formulates reconstruction as a staged inversion process: it initially focuses on low-frequency, high signal-to-noise ratio k-space data and progressively incorporates high-frequency or noise-dominated measurements. Learning difficulty is dynamically modulated through a dual mechanism of soft weighting and hard selection. By integrating Deep Image Prior (DIP) and Implicit Neural Representations (INR), the framework yields two variants—CogGen-DIP and CogGen-INR—that significantly enhance reconstruction fidelity and convergence speed in unsupervised settings, achieving performance comparable to certain supervised methods.
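The staged inversion described above amounts to fitting the generator against a curriculum-weighted data-consistency loss in k-space. As a minimal illustrative sketch (not the paper's exact objective), assuming a 2-D FFT forward model with a binary sampling mask and per-sample curriculum weights:

```python
import numpy as np

def weighted_dc_loss(image, y, mask, weights):
    """Curriculum-weighted k-space data-consistency loss (illustrative sketch).

    image:   current reconstruction estimate (H, W)
    y:       measured, undersampled k-space data (H, W), DC at the centre
    mask:    binary sampling mask (H, W)
    weights: curriculum weights in [0, 1]; larger = "easier" sample
    """
    # Forward model: 2-D FFT, shifted so low frequencies sit at the centre.
    k_est = np.fft.fftshift(np.fft.fft2(image))
    # Residual only at measured locations, modulated by the curriculum.
    resid = mask * (k_est - y)
    return float(np.sum(weights * np.abs(resid) ** 2))
```

With `weights` concentrated on the low-frequency centre early on, noisy high-frequency residuals contribute nothing to the gradient until later stages, which is the overfitting-mitigation effect the summary describes.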

📝 Abstract
Fully unsupervised deep generative modeling (FU-DGM) is promising for compressively sampled MRI (CS-MRI) when training data or compute are limited. Classical FU-DGMs such as DIP and INR rely on architectural priors, but the ill-conditioned inverse problem often demands many iterations and easily overfits measurement noise. We propose CogGen, a cognitive-load-informed FU-DGM that casts CS-MRI as staged inversion and regulates task-side "cognitive load" by progressively scheduling intrinsic difficulty and extraneous interference. CogGen replaces uniform data fitting with an easy-to-hard k-space weighting/selection strategy: early iterations emphasize low-frequency, high-SNR, structure-dominant samples, while higher-frequency or noise-dominated measurements are introduced later. We realize this schedule through self-paced curriculum learning (SPCL) with complementary criteria: a student mode that reflects what the model can currently learn and a teacher mode that indicates what it should follow, supporting both soft weighting and hard selection. Experiments and analyses show that CogGen-DIP and CogGen-INR improve reconstruction fidelity and convergence behavior compared with strong unsupervised baselines and competitive supervised pipelines.
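The easy-to-hard schedule in the abstract can be sketched as a frequency-based pacing rule. The snippet below is a minimal illustration under assumed conventions (a linear pace parameter and a radial-frequency difficulty measure); the paper's actual SPCL criteria are driven by student/teacher modes, not this fixed rule:

```python
import numpy as np

def kspace_curriculum_weights(shape, step, total_steps, mode="soft"):
    """Illustrative easy-to-hard k-space schedule (assumed, simplified rule).

    Early steps admit only the low-frequency centre of k-space; the
    admitted band grows linearly with the iteration count.
    """
    ky, kx = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2, (shape[1] - 1) / 2
    # Normalised radial frequency in [0, 1] as a proxy for sample difficulty.
    r = np.hypot(ky - cy, kx - cx)
    r /= r.max()
    # Pace parameter: fraction of the spectrum currently admitted.
    pace = min(1.0, (step + 1) / total_steps)
    if mode == "hard":
        # Hard selection: binary 0/1 mask over k-space samples.
        return (r <= pace).astype(float)
    # Soft weighting: weights ramp down linearly with frequency.
    return np.clip(1.0 - r / pace, 0.0, 1.0)

# Early on, only a small low-frequency band carries weight;
# later steps admit progressively higher frequencies.
w_early = kspace_curriculum_weights((64, 64), step=9, total_steps=100)
w_late = kspace_curriculum_weights((64, 64), step=99, total_steps=100)
```

The `mode` switch mirrors the abstract's dual mechanism: "soft" returns continuous per-sample weights, while "hard" returns a binary selection mask.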
Problem

Research questions and friction points this paper is trying to address.

compressive sensing MRI
unsupervised deep generative modeling
inverse problem
overfitting
reconstruction fidelity
Innovation

Methods, ideas, or system contributions that make the work stand out.

cognitive-load-informed
fully unsupervised deep generative modeling
self-paced curriculum learning
k-space weighting
compressive MRI reconstruction