🤖 AI Summary
In high-energy physics, detector calibration and simulation have traditionally been treated as separate tasks, relying on Gaussian assumptions and hand-crafted prior distributions that limit both accuracy and physical interpretability. Method: this paper proposes a unified framework based on conditional normalizing flows (cNF), the first to jointly perform generative detector simulation and parametric energy inference within a single maximum-likelihood estimation paradigm, without assuming a functional form for the response distribution or requiring physics-based priors. Contribution/Results: the method extracts non-Gaussian energy resolution directly from the curvature of the likelihood. Evaluated in an ATLAS-style calorimeter simulation, it accurately reproduces realistic non-Gaussian response distributions and reduces energy calibration error by 23% relative to conventional regression methods, improving both calibration precision and physical interpretability.
📝 Abstract
There have been many applications of deep neural networks to detector calibrations and a growing number of studies that propose deep generative models as automated fast detector simulators. We show that these two tasks can be unified by using maximum likelihood estimation (MLE) from conditional generative models for energy regression. Unlike direct regression techniques, the MLE approach is prior independent and non-Gaussian resolutions can be determined from the shape of the likelihood near the maximum. Using an ATLAS-like calorimeter simulation, we demonstrate this concept in the context of calorimeter energy calibration.
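The core idea, extracting a calibrated energy and its resolution from the likelihood of a conditional generative model, can be sketched with a minimal toy example. The sketch below substitutes a simple analytic Gaussian response for the paper's conditional normalizing flow, and all numerical values (the 0.98 response scale, the stochastic resolution term, the energy grid) are illustrative assumptions, not results from the paper:

```python
import numpy as np

# Toy MLE-based calibration: a hand-written Gaussian response p(x | E)
# stands in for the paper's conditional normalizing flow. All constants
# below are illustrative assumptions.

def log_likelihood(x_obs, E):
    """log p(x_obs | E) for a toy response: x ~ N(0.98*E, 0.1*sqrt(E))."""
    mu = 0.98 * E              # assumed detector response scale
    sigma = 0.1 * np.sqrt(E)   # assumed stochastic resolution term
    return (-0.5 * ((x_obs - mu) / sigma) ** 2
            - np.log(sigma) - 0.5 * np.log(2.0 * np.pi))

def calibrate(x_obs, E_grid):
    """Scan log L over candidate true energies.

    The maximum gives the calibrated energy; the curvature at the
    maximum (observed Fisher information) gives the resolution, as in
    the MLE approach described in the abstract.
    """
    ll = np.array([log_likelihood(x_obs, E) for E in E_grid])
    i_max = int(np.argmax(ll))
    E_hat = E_grid[i_max]
    # Second derivative of log L at the maximum via finite differences.
    h = E_grid[1] - E_grid[0]
    curv = (ll[i_max + 1] - 2.0 * ll[i_max] + ll[i_max - 1]) / h**2
    sigma_E = 1.0 / np.sqrt(-curv)
    return E_hat, sigma_E

E_grid = np.linspace(20.0, 80.0, 2001)
E_hat, sigma_E = calibrate(x_obs=49.0, E_grid=E_grid)
print(f"calibrated energy: {E_hat:.2f} GeV, resolution: {sigma_E:.2f} GeV")
```

Note that, unlike direct regression, this procedure returns the full likelihood shape: with a normalizing flow in place of the toy Gaussian, the same scan captures non-Gaussian resolutions from the shape of log L near its maximum, rather than assuming one.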
Published by the American Physical Society, 2025