Diffusion Reconstruction-based Data Likelihood Estimation for Core-Set Selection

📅 2025-11-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing core-set selection methods predominantly rely on heuristic scoring (e.g., training dynamics, uncertainty) without explicit modeling of data likelihood, limiting their ability to capture essential distributional structure. Method: This paper introduces, for the first time, the partial denoising reconstruction bias of diffusion models into core-set selection. We theoretically establish that this bias is negatively correlated with the log-likelihood of data and rigorously link it to the evidence lower bound (ELBO). Furthermore, we propose an information-entropy-driven, timestep-adaptive selection mechanism to enable distribution-aware, precise data scoring. Results: Evaluated on ImageNet, our method achieves full-dataset performance using only 50% of the data—outperforming state-of-the-art approaches by a significant margin—while demonstrating robustness and stability across varying subset sizes.

Technology Category

Application Category

📝 Abstract
Existing core-set selection methods predominantly rely on heuristic scoring signals such as training dynamics or model uncertainty, lacking explicit modeling of data likelihood. This omission may hinder the constructed subset from capturing subtle yet critical distributional structures that underpin effective model training. In this work, we propose a novel, theoretically grounded approach that leverages diffusion models to estimate data likelihood via reconstruction deviation induced by partial reverse denoising. Specifically, we establish a formal connection between reconstruction error and data likelihood, grounded in the Evidence Lower Bound (ELBO) of Markovian diffusion processes, thereby enabling a principled, distribution-aware scoring criterion for data selection. Complementarily, we introduce an efficient information-theoretic method to identify the optimal reconstruction timestep, ensuring that the deviation provides a reliable signal indicative of underlying data likelihood. Extensive experiments on ImageNet demonstrate that reconstruction deviation offers an effective scoring criterion, consistently outperforming existing baselines across selection ratios, and closely matching full-data training using only 50% of the data. Further analysis shows that the likelihood-informed nature of our score reveals informative insights in data selection, shedding light on the interplay between data distributional characteristics and model learning preferences.
Problem

Research questions and friction points this paper is trying to address.

Estimating data likelihood via diffusion reconstruction for core-set selection
Addressing lack of explicit data modeling in existing selection methods
Developing distribution-aware scoring using reconstruction deviation and ELBO
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages diffusion models for data likelihood estimation
Connects reconstruction error to data likelihood via ELBO
Uses information-theoretic method for optimal reconstruction timestep
🔎 Similar Papers
No similar papers found.
Mingyang Chen
Mingyang Chen
Baichuan Inc., Zhejiang University, The University of Edinburgh
Large Language ModelReinforcement LearningKnowledge Graph
Jiawei Du
Jiawei Du
National Taiwan University; ex-Intern @ Samsung Research
Speech processingNeural codingGenerative AIAI security
B
Bo Huang
The Hong Kong University of Science and Technology (Guangzhou)
Y
Yi Wang
Dongguan University of Technology
X
Xiaobo Zhang
Southwest Jiaotong University
W
Wei Wang
The Hong Kong University of Science and Technology (Guangzhou)