DATS: Distance-Aware Temperature Scaling for Calibrated Class-Incremental Learning

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
In class-incremental learning (CIL), inconsistent cross-task uncertainty calibration and susceptibility to catastrophic forgetting hinder reliable model deployment. To address this, we propose Distance-Aware Temperature Scaling (DATS), a novel calibration framework that dynamically couples task-distance estimation with temperature scaling, departing from conventional centralized calibration that uses a shared, static temperature parameter. DATS computes inter-task distances from class prototypes, enabling adaptive inference of task proximity without requiring task identity at test time and thereby facilitating task-specific temperature adjustment. Evaluated on standard CIL benchmarks and real-world imbalanced biomedical datasets, DATS significantly reduces cross-task Expected Calibration Error (ECE) while improving both calibration stability and accuracy over state-of-the-art methods. By unifying task-aware uncertainty modeling with incremental learning, DATS establishes a new paradigm for trustworthy, safety-critical continual learning.

📝 Abstract
Continual Learning (CL) has recently gained increasing attention for its ability to enable a single model to learn incrementally from a sequence of new classes. In this scenario, it is important to keep consistent predictive performance across all the classes and prevent the so-called Catastrophic Forgetting (CF). However, in safety-critical applications, predictive performance alone is insufficient. Predictive models should also be able to reliably communicate their uncertainty in a calibrated manner, that is, with confidence scores aligned to the true frequencies of target events. Existing approaches in CL address calibration primarily from a data-centric perspective, relying on a single temperature shared across all tasks. Such solutions overlook task-specific differences, leading to large fluctuations in calibration error across tasks. For this reason, we argue that a more principled approach should adapt the temperature according to the distance to the current task. However, the unavailability of task information at test time poses a major challenge to achieving this objective. To this end, we propose Distance-Aware Temperature Scaling (DATS), which combines prototype-based distance estimation with distance-aware calibration to infer task proximity and assign adaptive temperatures without prior task information. Through extensive empirical evaluation on both standard benchmarks and real-world, imbalanced datasets from the biomedical domain, our approach proves stable, reliable, and consistent in reducing calibration error across tasks compared to state-of-the-art approaches.
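For readers unfamiliar with the baseline the abstract contrasts against, the two standard building blocks are temperature scaling (dividing logits by a scalar T before the softmax) and Expected Calibration Error (the confidence-vs-accuracy gap averaged over confidence bins). A minimal sketch of both, using NumPy (generic textbook definitions, not the paper's code):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: T > 1 softens confidences, T < 1 sharpens them.
    z = logits / temperature
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def expected_calibration_error(probs, labels, n_bins=10):
    # ECE: bin predictions by confidence, then average the per-bin
    # |confidence - accuracy| gap weighted by bin size.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    correct = (pred == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            ece += mask.mean() * abs(conf[mask].mean() - correct[mask].mean())
    return ece
```

A single shared temperature is fitted once on held-out data; the paper's point is that in CL the best T differs per task, which this static scheme cannot capture.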
Problem

Research questions and friction points this paper is trying to address.

Addresses calibration issues in continual learning models across tasks
Overcomes limitations of single-temperature scaling ignoring task differences
Enables adaptive temperature scaling without task information during deployment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses prototype-based distance estimation for task proximity
Applies distance-aware calibration without task information
Assigns adaptive temperatures based on task distance
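The three bullets above can be sketched in a few lines. Note this is a hypothetical illustration only: the prototype construction (per-class mean embedding) matches common practice, but the distance-to-temperature mapping shown here (linear interpolation between `t_near` and `t_far`, with an assumed normalizer `d_max`) is an assumption, not the paper's exact formulation:

```python
import numpy as np

def class_prototypes(embeddings, labels):
    # One prototype per class: the mean embedding of that class's samples.
    return {c: embeddings[labels == c].mean(axis=0) for c in np.unique(labels)}

def adaptive_temperature(x, prototypes, t_near=1.0, t_far=3.0, d_max=10.0):
    # Distance to the nearest class prototype serves as a proxy for task
    # proximity, so no task identity is needed at test time. Nearby inputs
    # keep a low temperature; distant (likely off-task) inputs get a higher,
    # softening temperature. t_near, t_far, d_max are illustrative values.
    d = min(np.linalg.norm(x - p) for p in prototypes.values())
    w = min(d / d_max, 1.0)  # clip normalized distance into [0, 1]
    return t_near + w * (t_far - t_near)
```

In a full pipeline, the returned temperature would divide the model's logits before the softmax, as in standard temperature scaling.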
Giuseppe Serra
Goethe University Frankfurt, German Cancer Consortium (DKTK)
Florian Buettner
Frankfurt University/DKFZ