🤖 AI Summary
This work addresses the challenges of computationally expensive voxel-wise Bayesian inference in whole-body dynamic PET imaging and the limited generalizability of deep learning approaches. The authors propose a likelihood-free, model-agnostic approximate Bayesian inference framework that replaces conventional likelihood evaluation with forward kinetic simulations and discrepancy metrics. By fully vectorizing the computation, the method enables efficient GPU parallelization, circumventing both MCMC sampling and neural network training. This approach achieves, for the first time, training-free, tracer- and model-agnostic voxel-level uncertainty quantification. Evaluated on both simulated and real whole-body [18F]FDG PET data, it efficiently produces high-quality Ki parametric maps, yielding posterior estimates superior to non-negative least squares (NNLS) while better preserving spatial correlations and sensitivity in activation detection.
📝 Abstract
Dynamic PET kinetic modeling increasingly demands voxelwise uncertainty quantification and robust model selection. Yet total-body PET (TB-PET) data volumes make conventional Bayesian approaches, such as per-voxel MCMC, computationally impractical, while deep models typically require retraining and careful revalidation when tracers, protocols, or kinetic models change, without necessarily improving inference speed.
Vectorized voxelwise approximate Bayesian computation (vPET-ABC) is introduced as a likelihood-free, model-agnostic posterior inference framework for dynamic PET kinetic modeling at total-body scale. The method replaces explicit likelihood evaluation with forward simulations and a discrepancy test, then exploits full vectorization to transform voxelwise inference into an embarrassingly parallel workload suited to modern GPUs.
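The core idea — draw kinetic parameters from a prior, run the forward model for all samples at once, and accept samples whose simulated time–activity curves fall within a discrepancy tolerance of each voxel's data — can be illustrated with a minimal vectorized ABC rejection sketch. This is not the authors' implementation: the toy Patlak-style forward model, the prior ranges, the Euclidean discrepancy, and the 5% acceptance quantile are all illustrative assumptions, and NumPy stands in for the GPU backend.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: T time frames, V voxels, S prior samples.
T, V, S = 20, 1000, 500
t = np.linspace(0.5, 50.0, T)            # frame mid-times (min), toy grid
cp = np.exp(-0.1 * t) + 0.05             # toy plasma input function
int_cp = np.cumsum(cp) * (t[1] - t[0])   # crude running integral of input

def forward(Ki, V0):
    # Toy Patlak-style forward model, vectorized over parameter sets:
    # C(t) = Ki * integral(Cp) + V0 * Cp(t); returns shape (n, T).
    return Ki[:, None] * int_cp[None, :] + V0[:, None] * cp[None, :]

# One shared bank of prior draws and simulations reused for every voxel.
Ki_s = rng.uniform(0.0, 0.05, S)
V0_s = rng.uniform(0.0, 0.5, S)
sims = forward(Ki_s, V0_s)                       # (S, T)

# Synthetic noisy voxel TACs from known ground-truth parameters.
Ki_true = rng.uniform(0.01, 0.04, V)
V0_true = rng.uniform(0.1, 0.3, V)
data = forward(Ki_true, V0_true) + rng.normal(0.0, 0.01, (V, T))

# Discrepancy between every sample and every voxel in one broadcast: (S, V).
d = np.linalg.norm(sims[:, None, :] - data[None, :, :], axis=-1)

# ABC rejection: per-voxel tolerance at the 5% distance quantile.
eps = np.quantile(d, 0.05, axis=0)               # (V,)
accept = d <= eps                                # (S, V) boolean mask

# Posterior mean Ki per voxel from the accepted samples.
post_Ki = (Ki_s[:, None] * accept).sum(axis=0) / accept.sum(axis=0)
```

Because every voxel reuses the same simulation bank and the distance computation is a single broadcast, the workload is embarrassingly parallel; swapping NumPy for a GPU array library (e.g. CuPy or JAX) changes only the array namespace.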
In simulation, vPET-ABC produced posterior summaries with small divergence from sequential Monte Carlo baselines, and posterior mean estimates significantly more accurate than non-negative least squares (NNLS). For model selection between the linear parametric neurotransmitter model (lp-ntPET) and the multilinear reference tissue model, vPET-ABC maintained high sensitivity under high noise with a moderate loss of specificity, whereas NNLS with the Bayesian information criterion (NNLS+BIC) exhibited the opposite trade-off, with near-zero sensitivity. In a human cigarette smoking dataset, vPET-ABC yielded denser probabilistic activation maps than lp-ntPET with the effective number of parameters criterion. On a 50-minute total-body [18F]FDG study, vPET-ABC generated high-quality whole-volume K_i parametric images within practical runtimes on a single GPU, while also preserving local spatial correlation better than NNLS.
Overall, vPET-ABC delivers fast, training-free, uncertainty-aware inference that scales to TB-PET and remains portable across tracers and kinetic models.