🤖 AI Summary
Existing Kernel Inducing Points (KIP) methods are constrained to the squared loss, which limits their applicability to classification tasks that typically employ non-quadratic convex losses such as cross-entropy or hinge loss. To address this limitation, we propose Duality Gap Kernel Inducing Points (DGKIP), the first dataset distillation framework to leverage duality gap theory from convex optimization. DGKIP lifts the restriction on the form of the loss function, enabling compatibility with arbitrary convex losses. The method establishes theoretical upper bounds on test error and prediction consistency after distillation by linking the duality gap to a bound on parameter changes. Experiments on MNIST and CIFAR-10 demonstrate that DGKIP preserves KIP's computational efficiency while significantly broadening loss-function compatibility: distilled models achieve stable accuracy and strong generalization across diverse convex losses, including cross-entropy.
📝 Abstract
We propose Duality Gap KIP (DGKIP), an extension of the Kernel Inducing Points (KIP) method for dataset distillation. While most dataset distillation methods rely on bi-level optimization, KIP was introduced as a way to avoid it; however, KIP is limited to the squared loss and does not support other loss functions (e.g., cross-entropy or hinge loss) that are better suited to classification tasks. DGKIP removes this limitation by exploiting a duality-gap-based upper bound on the change in model parameters after dataset distillation, which extends the bi-level-free approach to a wide range of convex loss functions. We also characterize theoretical properties of DGKIP by providing upper bounds on the test error and on prediction consistency after dataset distillation. Experimental results on standard benchmarks such as MNIST and CIFAR-10 demonstrate that DGKIP retains the efficiency of KIP while offering broader applicability and robust performance.
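The abstract's key device is that for a strongly convex objective, the duality gap upper-bounds the distance between the current parameters and the optimum. The sketch below illustrates this for L2-regularized logistic regression (a hypothetical stand-in for the convex losses DGKIP targets; the data, the `lam` value, and the gradient-descent loop are illustrative assumptions, not the paper's method): by weak duality the gap `P(w) - D(alpha)` is nonnegative, and `lam`-strong convexity of the primal gives `||w - w*||^2 <= (2/lam) * gap`.

```python
import numpy as np

def primal(w, X, y, lam):
    # P(w) = sum_i log(1 + exp(-y_i <x_i, w>)) + (lam/2)||w||^2
    margins = y * (X @ w)
    return np.sum(np.log1p(np.exp(-margins))) + 0.5 * lam * (w @ w)

def dual(alpha, X, y, lam):
    # Fenchel dual of the logistic loss; feasible for alpha_i in (0, 1).
    v = X.T @ (alpha * y)
    entropy = np.sum(alpha * np.log(alpha) + (1 - alpha) * np.log(1 - alpha))
    return -entropy - (v @ v) / (2 * lam)

def duality_gap(w, X, y, lam):
    # Map the primal iterate to a feasible dual point:
    # alpha_i = sigmoid(-margin_i), clipped away from {0, 1} for the logs.
    margins = y * (X @ w)
    alpha = np.clip(1.0 / (1.0 + np.exp(margins)), 1e-12, 1 - 1e-12)
    return primal(w, X, y, lam) - dual(alpha, X, y, lam)

# Toy data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sign(X @ rng.normal(size=5) + 0.1 * rng.normal(size=100))
lam = 1.0

w = np.zeros(5)
for _ in range(200):  # plain gradient descent on the primal
    margins = y * (X @ w)
    grad = -X.T @ (y / (1.0 + np.exp(margins))) + lam * w
    w -= 0.01 * grad

gap = duality_gap(w, X, y, lam)
bound = np.sqrt(2 * gap / lam)  # ||w - w*|| <= sqrt(2 * gap / lam)
print(f"duality gap = {gap:.4f}, parameter-distance bound = {bound:.4f}")
```

DGKIP's insight, per the abstract, is that a bound of this shape can be evaluated on the distilled dataset without ever solving the inner optimization to convergence, which is what removes the bi-level structure.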