FedDyMem: Efficient Federated Learning with Dynamic Memory and Memory-Reduce for Unsupervised Image Anomaly Detection

📅 2025-02-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Addressing challenges in federated unsupervised anomaly detection (UAD)—including skewed single-class data distributions, cross-client product heterogeneity, and stringent privacy constraints—this paper proposes a novel federated knowledge-sharing paradigm based on a dynamic memory bank. Instead of uploading model parameters, clients upload learnable sample representations to a shared memory; memory compression and k-means-based global aggregation ensure representation consistency and communication efficiency while preserving privacy. The authors further introduce a dynamic memory generator, a metric-learning loss, and a federated feature-alignment strategy to enhance discriminability and generalization. Evaluated on six industrial and medical datasets derived from eleven public benchmarks, the method achieves an average 5.2% AUC improvement over state-of-the-art federated UAD approaches, reduces communication overhead by over 60%, and establishes the first privacy-preserving federated UAD framework that simultaneously attains high accuracy, low bandwidth consumption, and strong generalization.

📝 Abstract
Unsupervised image anomaly detection (UAD) has become a critical process in industrial and medical applications, but it faces growing challenges due to increasing concerns over data privacy. The limited class diversity inherent to one-class classification tasks, combined with distribution biases caused by variations in products across and within clients, poses significant challenges for preserving data privacy with federated UAD. Thus, this article proposes an efficient federated learning method with dynamic memory and memory-reduce for unsupervised image anomaly detection, called FedDyMem. Considering all client data belongs to a single class (i.e., normal sample) in UAD and the distribution of intra-class features demonstrates significant skewness, FedDyMem facilitates knowledge sharing between the client and server through the client's dynamic memory bank instead of model parameters. In the local clients, a memory generator and a metric loss are employed to improve the consistency of the feature distribution for normal samples, leveraging the local model to update the memory bank dynamically. For efficient communication, a memory-reduce method based on weighted averages is proposed to significantly decrease the scale of memory banks. On the server, global memory is constructed and distributed to individual clients through k-means aggregation. Experiments conducted on six industrial and medical datasets, comprising a mixture of six products or health screening types derived from eleven public datasets, demonstrate the effectiveness of FedDyMem.
Problem

Research questions and friction points this paper is trying to address.

Addresses data privacy in unsupervised image anomaly detection.
Improves feature consistency in one-class classification tasks.
Reduces communication costs with efficient memory management.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic memory bank for client-server knowledge sharing
Memory-reduce method using weighted averages for efficiency
K-means aggregation for global memory distribution
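The last two innovations above can be sketched together: each client compresses its memory bank into a smaller set of averaged feature vectors before upload, and the server pools all client banks and runs k-means to build the global memory it redistributes. The sketch below is a minimal, pure-Python illustration of that flow; the function names (`memory_reduce`, `aggregate_global_memory`), the uniform-weight chunked averaging, and the simple Lloyd-style k-means are assumptions for clarity, not the paper's exact algorithm (which uses learned weighted averages).

```python
import random

def mean_vec(vecs):
    """Coordinate-wise mean of a list of equal-length vectors."""
    return [sum(coord) / len(vecs) for coord in zip(*vecs)]

def memory_reduce(bank, target_size):
    """Compress a client's memory bank to `target_size` entries.

    Hypothetical stand-in for the paper's weighted-average memory-reduce:
    here each contiguous chunk is replaced by its (uniformly weighted) mean.
    """
    n = len(bank)
    reduced = []
    for i in range(target_size):
        chunk = bank[i * n // target_size:(i + 1) * n // target_size]
        if chunk:
            reduced.append(mean_vec(chunk))
    return reduced

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's k-means; returns the k cluster centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared L2 distance)
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[j].append(p)
        # recompute centers; keep the old center if a cluster went empty
        centers = [mean_vec(cl) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers

def aggregate_global_memory(client_banks, global_size):
    """Server side: pool all uploaded (reduced) banks and cluster them."""
    pooled = [vec for bank in client_banks for vec in bank]
    return kmeans(pooled, global_size)

# Toy run: 3 clients, each with 8 four-dimensional memory entries.
banks = [[[float(i + j + c) for j in range(4)] for i in range(8)]
         for c in range(3)]
reduced_banks = [memory_reduce(b, 4) for b in banks]   # 8 -> 4 entries each
global_memory = aggregate_global_memory(reduced_banks, 2)
```

The compression step is where the communication savings come from: each client uploads `target_size` vectors instead of its full bank (or full model parameters), and the server's k-means keeps the redistributed global memory at a fixed size regardless of how many clients participate.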