DSP-Reg: Domain-Sensitive Parameter Regularization for Robust Domain Generalization

📅 2026-01-27
🤖 AI Summary
This work addresses a critical limitation in existing domain generalization methods, which predominantly emphasize feature-level invariance while overlooking parameter-level sensitivity to domain shifts. To bridge this gap, the paper introduces a novel covariance-based mechanism to quantify parameter sensitivity by analyzing the statistical properties of parameter gradients across domains, thereby enabling precise identification of parameters that are either robust or sensitive to domain shifts. Building on this insight, the authors devise a soft regularization strategy that steers the model to prioritize domain-invariant parameters and suppress domain-specific ones, achieving fine-grained control over the learning process. The proposed approach attains an average accuracy of 66.7% across standard benchmarks—including PACS, VLCS, OfficeHome, and DomainNet—significantly outperforming current state-of-the-art methods.
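The covariance-based sensitivity mechanism described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name and the reduction to the diagonal of the cross-domain gradient covariance (i.e., per-parameter gradient variance) are assumptions for illustration.

```python
import numpy as np

def parameter_sensitivity(grads_per_domain):
    """Estimate per-parameter sensitivity to domain shift (illustrative sketch).

    grads_per_domain: array-like of shape (n_domains, n_params), one flattened
    gradient vector per source domain. A parameter whose gradient varies
    strongly across domains is treated as domain-sensitive; one whose gradient
    is stable across domains is treated as domain-invariant.
    """
    g = np.asarray(grads_per_domain, dtype=float)
    mean = g.mean(axis=0, keepdims=True)          # average gradient across domains
    # Diagonal of the cross-domain covariance matrix: per-parameter variance.
    sensitivity = ((g - mean) ** 2).mean(axis=0)
    return sensitivity
```

For example, a parameter whose gradient is identical in every source domain scores zero, while one whose gradient flips sign across domains scores high.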

📝 Abstract
Domain Generalization (DG) is a critical area that focuses on developing models capable of performing well on data from unseen distributions, which is essential for real-world applications. Existing approaches primarily concentrate on learning domain-invariant features, on the assumption that a model robust to variations in the source domains will generalize well to unseen target domains. However, these approaches neglect a deeper analysis at the parameter level, which makes it hard for the model to explicitly differentiate between parameters that are sensitive to domain shifts and those that are robust, potentially hindering its overall ability to generalize. To address these limitations, we first build a covariance-based parameter sensitivity analysis framework to quantify the sensitivity of each parameter in a model to domain shifts. By computing the covariance of parameter gradients across multiple source domains, we can identify parameters that are more susceptible to domain variations, which serves as our theoretical foundation. Based on this, we propose Domain-Sensitive Parameter Regularization (DSP-Reg), a principled framework that guides model optimization via a soft regularization technique, encouraging the model to rely more on domain-invariant parameters while suppressing those that are domain-specific. This approach provides more granular control over the model's learning process, leading to improved robustness and generalization to unseen domains. Extensive experiments on benchmarks such as PACS, VLCS, OfficeHome, and DomainNet demonstrate that DSP-Reg outperforms state-of-the-art approaches, achieving an average accuracy of 66.7% and surpassing all baselines.
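The soft regularization the abstract describes can be sketched as a sensitivity-weighted penalty added to the task loss. This is an illustrative sketch only, not the paper's implementation: the function name, the normalization of sensitivity scores, and the weighted-L2 form of the penalty are assumptions.

```python
import numpy as np

def dsp_penalty(params, sensitivity, lam=0.1):
    """Soft regularization sketch: penalize domain-sensitive parameters more.

    params, sensitivity: arrays of the same shape. Each parameter's L2 penalty
    is weighted by its (normalized) sensitivity score, so optimization is
    nudged to rely on domain-invariant parameters and to suppress
    domain-specific ones, rather than hard-masking any parameter.
    """
    sensitivity = np.asarray(sensitivity, dtype=float)
    params = np.asarray(params, dtype=float)
    # Normalize sensitivities to a distribution so lam sets the overall scale.
    weights = sensitivity / (sensitivity.sum() + 1e-12)
    return lam * float((weights * params ** 2).sum())
```

In training, this penalty would be added to the empirical risk, so a perfectly domain-invariant parameter (sensitivity zero) is left unregularized while highly domain-specific parameters are shrunk toward zero.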
Problem

Research questions and friction points this paper is trying to address.

Domain Generalization
Parameter Sensitivity
Domain Shift
Model Robustness
Unseen Domains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Domain Generalization
Parameter Sensitivity
Covariance-based Analysis
Regularization
Domain-Invariant Learning