Optimizing Perturbations for Improved Training of Machine Learning Models

📅 2025-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Machine learning training is significantly more time-consuming than inference, and the design of input or parameter perturbations has long relied on empirical trial-and-error. Method: This paper models training dynamics as a first-passage process and introduces a statistical mechanics framework to analyze model responses to input/parameter perturbations. It proposes, for the first time, a single-frequency perturbation response theory grounded in the quasi-stationary assumption, and rigorously proves its generalizability to multi-frequency perturbation regimes—enabling rational optimization of perturbation protocols. Contribution/Results: Evaluated on ResNet-18 trained for CIFAR-10 classification, the method precisely identifies the optimal perturbation type and frequency, reducing training iterations by 23% and improving test accuracy by 1.4 percentage points, thereby substantially enhancing both training efficiency and generalization performance.

📝 Abstract
Machine learning models have become indispensable tools in applications across the physical sciences. Their training is often time-consuming, vastly exceeding the inference timescales. Several protocols have been developed to perturb the learning process and improve the training, such as shrink and perturb, warm restarts, and stochastic resetting. For classifiers, these perturbations have been shown to result in enhanced speedups or improved generalization. However, the design of such perturbations is usually done ad hoc by intuition and trial and error. To rationally optimize training protocols, we frame them as first-passage processes and consider their response to perturbations. We show that if the unperturbed learning process reaches a quasi-steady state, the response at a single perturbation frequency can predict the behavior at a wide range of frequencies. We demonstrate that this is the case when training a CIFAR-10 classifier using the ResNet-18 model and use this approach to identify an optimal perturbation and frequency. Our work allows optimization of training protocols of machine learning models using a statistical mechanical approach.
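The first-passage framing above can be illustrated with a toy model. The sketch below is not the paper's actual experiment: it treats "training loss" as a noisy drift toward zero and measures the number of steps until the loss first crosses a target threshold, optionally restarting the process (stochastic resetting) at a fixed period. All names and parameter values are hypothetical, chosen only to make the first-passage idea concrete.

```python
import random

def first_passage_time(threshold=0.1, reset_every=None, max_steps=100_000, seed=0):
    """Toy first-passage model of training (hypothetical, for illustration).

    The 'loss' starts at 1.0 and takes small drift-plus-noise steps toward
    zero. Returns the first step at which it crosses `threshold`. If
    `reset_every` is set, the loss is restarted to its initial value every
    `reset_every` steps, mimicking a periodic perturbation protocol.
    """
    rng = random.Random(seed)
    loss = 1.0
    for step in range(1, max_steps + 1):
        # Small deterministic improvement plus Gaussian noise per step.
        loss += -0.001 + rng.gauss(0.0, 0.02)
        loss = max(loss, 0.0)
        if loss <= threshold:
            return step  # first passage: target loss reached
        if reset_every and step % reset_every == 0:
            loss = 1.0  # stochastic resetting: restart from the initial state
    return max_steps  # did not converge within the step budget
```

Averaging `first_passage_time` over many seeds, with and without resetting, gives a crude analogue of comparing perturbed and unperturbed training protocols.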
Problem

Research questions and friction points this paper is trying to address.

Optimize perturbations for machine learning training.
Predict behavior across perturbation frequencies efficiently.
Enhance training speed and model generalization.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Optimizing perturbations via first-passage processes
Predicting behavior at multiple frequencies
Statistical mechanics for training protocol optimization
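One concrete payoff of the first-passage view is that, for restart-type perturbations, the mean completion time under any restart period can be predicted from statistics of the unperturbed process alone, via the standard renewal identity for sharp restart: ⟨T_τ⟩ = E[min(T, τ)] / P(T ≤ τ). The sketch below implements this identity from sampled unperturbed first-passage times; the function name is hypothetical and this is an illustration of the general restart formula, not code from the paper.

```python
def mean_fpt_under_restart(fpts, tau):
    """Predict the mean first-passage time under sharp restart at period tau.

    Uses only first-passage samples `fpts` of the UNPERTURBED process and
    the renewal identity  <T_tau> = E[min(T, tau)] / P(T <= tau).
    Returns infinity if no sample finishes within one restart period.
    """
    n = len(fpts)
    finished = sum(1 for t in fpts if t <= tau)  # runs that beat the restart clock
    if finished == 0:
        return float("inf")  # restart period too short: process never completes
    e_min = sum(min(t, tau) for t in fpts) / n   # E[min(T, tau)]
    return e_min / (finished / n)
```

For example, with samples `[1, 3, 5, 7]`, a restart period of 4 yields a predicted mean of 6.0, while a period longer than every sample recovers the plain sample mean, 4.0. Scanning `tau` over a range then identifies the optimal restart frequency without rerunning the perturbed process.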
Sagi Meir
PhD candidate, department of chemical physics, Tel Aviv University
Tommer D. Keidar
PhD Student, Tel Aviv University
stochastic resetting, first-passage processes, biophysics
S. Reuveni
School of Chemistry, Tel Aviv University, Tel Aviv 6997801, Israel; The Center for Physics and Chemistry of Living Systems, Tel Aviv University, Tel Aviv 6997801, Israel; The Center for Computational Molecular and Materials Science, Tel Aviv University, Tel Aviv 6997801, Israel
Barak Hirshberg
Assistant Professor of Chemistry, Tel Aviv University
Molecular Dynamics, Path Integrals, Enhanced Sampling, Theoretical Chemistry, Physical Chemistry