Memorize Early, Then Query: Inlier-Memorization-Guided Active Outlier Detection

📅 2026-01-16
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the limitations of existing unsupervised anomaly detection methods when the boundary between normal and anomalous samples is ambiguous or when anomalies form dense clusters. To this end, the authors propose IMBoost, a novel framework that integrates inlier-memorization-based modeling of normal data with active learning. In the first stage, the model strengthens its memorization of normal samples; in the second stage, it actively selects the most informative samples through a new query strategy and a tailored loss function to maximize the score gap between normal and anomalous instances. Theoretical analysis shows that IMBoost consistently enlarges the risk discrepancy between the two classes. Extensive experiments show that IMBoost significantly outperforms current active anomaly detection approaches across multiple benchmark datasets while substantially reducing computational overhead.

πŸ“ Abstract
Outlier detection (OD) aims to identify abnormal instances, known as outliers or anomalies, by learning typical patterns of normal data, or inliers. Performing OD under an unsupervised regime, without any information about anomalous instances in the training data, is challenging. A recently observed phenomenon, known as the inlier-memorization (IM) effect, where deep generative models (DGMs) tend to memorize inlier patterns during early training, provides a promising signal for distinguishing outliers. However, existing unsupervised approaches that rely solely on the IM effect still struggle when inliers and outliers are not well separated or when outliers form dense clusters. To address these limitations, we incorporate active learning to selectively acquire informative labels, and propose IMBoost, a novel framework that explicitly reinforces the IM effect to improve outlier detection. Our method consists of two stages: 1) a warm-up phase that induces and promotes the IM effect, and 2) a polarization phase in which actively queried samples are used to maximize the discrepancy between inlier and outlier scores. In particular, we propose a novel query strategy and tailored loss function in the polarization phase to effectively identify informative samples and fully leverage the limited labeling budget. We provide a theoretical analysis showing that IMBoost consistently decreases inlier risk while increasing outlier risk throughout training, thereby amplifying their separation. Extensive experiments on diverse benchmark datasets demonstrate that IMBoost not only significantly outperforms state-of-the-art active OD methods but also requires substantially less computational cost.
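The two-stage structure described in the abstract can be illustrated with a deliberately simplified sketch. The snippet below is not the paper's method: IMBoost uses deep generative models and a tailored loss, whereas here the "score" is just the squared distance to a learned center, the warm-up is a plain mean fit (inliers dominate, so the center lands near the inlier mode), and the polarization step queries the most ambiguous samples and nudges the center toward labeled inliers and away from labeled outliers. The contamination rate, learning rate, and query budget are all hypothetical choices for the toy data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 200 inliers around the origin, 20 outliers clustered at (4, 4).
inliers = rng.normal(0.0, 1.0, size=(200, 2))
outliers = rng.normal(4.0, 0.5, size=(20, 2))
X = np.vstack([inliers, outliers])
y = np.array([0] * 200 + [1] * 20)  # oracle labels, revealed only on query

def score(center, X):
    """Outlier score: squared distance to the memorized inlier center."""
    return ((X - center) ** 2).sum(axis=1)

# Stage 1 (warm-up): fit the center on all data. Because inliers dominate,
# this crude stand-in for early-training memorization lands near the
# inlier mode.
center = X.mean(axis=0)
s = score(center, X)
gap_warm = s[y == 1].mean() - s[y == 0].mean()

# Stage 2 (polarization): repeatedly query the most ambiguous samples
# (scores closest to the current decision threshold), then pull the center
# toward labeled inliers and push it away from labeled outliers, which
# widens the inlier/outlier score gap.
budget, lr = 20, 0.05           # hypothetical labeling budget and step size
for _ in range(10):
    s = score(center, X)
    thresh = np.quantile(s, 1 - 20 / len(X))      # assumed contamination rate
    queried = np.argsort(np.abs(s - thresh))[:budget]  # most uncertain samples
    labels = y[queried]                            # oracle annotation
    Xin = X[queried][labels == 0]
    Xout = X[queried][labels == 1]
    # Gradient-style update: decrease inlier risk, increase outlier risk.
    if len(Xin):
        center += lr * (Xin.mean(axis=0) - center)
    if len(Xout):
        center -= lr * (Xout.mean(axis=0) - center)

s = score(center, X)
gap_polar = s[y == 1].mean() - s[y == 0].mean()
print(gap_polar > gap_warm)  # the score gap grows after polarization
```

On this toy data the mean outlier score minus the mean inlier score grows after the polarization loop, mirroring the paper's claim that training enlarges the risk discrepancy between the two classes; the specific query rule and update are illustrative stand-ins for IMBoost's query strategy and loss.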
Problem

Research questions and friction points this paper is trying to address.

outlier detection
inlier-memorization
active learning
unsupervised learning
anomaly detection
Innovation

Methods, ideas, or system contributions that make the work stand out.

inlier-memorization
active learning
outlier detection
deep generative models
query strategy
Minseo Kang
Department of Statistics, Sungshin Women's University
Seunghwan Park
Information Statistics, Kangwon National University
Dongha Kim
Arizona State University