Drift-aware Collaborative Assistance Mixture of Experts for Heterogeneous Multistream Learning

📅 2025-08-03
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
To address the challenges of online learning under dynamic concept drift and inter-stream heterogeneity in multi-data-stream settings, this paper proposes the CAMEL framework. Methodologically, CAMEL (1) assigns each stream an independent feature extractor and task head to explicitly model stream-wise heterogeneity; (2) introduces a collaborative mixture-of-experts mechanism augmented with multi-head attention for context-aware knowledge coordination and targeted transfer; and (3) incorporates an autonomous expert tuning strategy that enables dynamic expert instantiation, incremental updating, and pruning, thereby jointly enhancing adaptability to concept drift and mitigating catastrophic forgetting. Extensive experiments across diverse multi-stream benchmarks demonstrate that CAMEL consistently outperforms state-of-the-art methods, achieving significant improvements in generalization, robustness, and continual learning efficiency.

๐Ÿ“ Abstract
Learning from multiple data streams in real-world scenarios is fundamentally challenging due to intrinsic heterogeneity and unpredictable concept drifts. Existing methods typically assume homogeneous streams and employ static architectures with indiscriminate knowledge fusion, limiting generalizability in complex dynamic environments. To tackle this gap, we propose CAMEL, a dynamic extbf{C}ollaborative extbf{A}ssistance extbf{M}ixture of extbf{E}xperts extbf{L}earning framework. It addresses heterogeneity by assigning each stream an independent system with a dedicated feature extractor and task-specific head. Meanwhile, a dynamic pool of specialized private experts captures stream-specific idiosyncratic patterns. Crucially, collaboration across these heterogeneous streams is enabled by a dedicated assistance expert. This expert employs a multi-head attention mechanism to distill and integrate relevant context autonomously from all other concurrent streams. It facilitates targeted knowledge transfer while inherently mitigating negative transfer from irrelevant sources. Furthermore, we propose an Autonomous Expert Tuner (AET) strategy, which dynamically manages expert lifecycles in response to drift. It instantiates new experts for emerging concepts (freezing prior ones to prevent catastrophic forgetting) and prunes obsolete ones. This expert-level plasticity provides a robust and efficient mechanism for online model capacity adaptation. Extensive experiments demonstrate CAMEL's superior generalizability across diverse multistreams and exceptional resilience against complex concept drifts.
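The assistance expert described in the abstract amounts to a multi-head scaled dot-product attention step in which one stream's representation queries the representations of all other concurrent streams. The following NumPy toy illustrates that mechanism only; the function name, identity projections (no learned weight matrices), and head count are assumptions for the sketch, not the paper's implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def assistance_attention(query, contexts, n_heads=4):
    """One assistance step: the target stream's feature vector (query, shape (d,))
    attends over the feature vectors of the other concurrent streams
    (contexts, shape (n_streams, d)) with multi-head scaled dot-product attention.
    Projections are identity to keep the sketch minimal; d must divide by n_heads."""
    d = query.shape[0]
    assert d % n_heads == 0
    dh = d // n_heads
    q = query.reshape(n_heads, dh)            # (H, dh)
    kv = contexts.reshape(-1, n_heads, dh)    # (S, H, dh)
    # per-head attention scores over the S context streams: (H, S)
    scores = np.einsum('hd,shd->hs', q, kv) / np.sqrt(dh)
    weights = softmax(scores, axis=-1)
    # weighted sum of context values per head, then concatenate heads back to (d,)
    out = np.einsum('hs,shd->hd', weights, kv).reshape(d)
    return out, weights
```

The attention weights make the transfer targeted: a context stream that is irrelevant to the query receives low weight, which is one plausible reading of how the assistance expert limits negative transfer.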
Problem

Research questions and friction points this paper is trying to address.

Addresses heterogeneity in multistream learning with dedicated systems
Enables dynamic collaboration across streams using attention mechanisms
Manages concept drifts via autonomous expert lifecycle adaptation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic Mixture of Experts for heterogeneous streams
Multi-head attention for autonomous context integration
Autonomous Expert Tuner for drift adaptation
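The expert-lifecycle idea behind the Autonomous Expert Tuner (instantiate a new expert on drift, freeze prior ones against forgetting, prune obsolete ones) can be illustrated with a toy manager. Everything here is a hypothetical sketch under assumed rules: the class name, the idle-step pruning criterion, and the externally supplied drift signal are not the paper's actual mechanisms.

```python
class AutonomousExpertTuner:
    """Toy expert-lifecycle manager in the spirit of the paper's AET.
    Drift detection and the real pruning/update criteria are out of scope;
    a boolean drift flag and an idle-step threshold stand in for them."""

    def __init__(self, make_expert, prune_after=100):
        self.make_expert = make_expert    # factory for a fresh expert
        self.prune_after = prune_after    # frozen experts idle this long are pruned
        self.step = 0
        self.experts = []                 # dicts: {"expert", "frozen", "last_used"}
        self._spawn()

    def _spawn(self):
        self.experts.append(
            {"expert": self.make_expert(), "frozen": False, "last_used": self.step}
        )

    @property
    def active(self):
        # the most recently spawned expert is the only trainable one
        return self.experts[-1]["expert"]

    def on_step(self, drift_detected):
        self.step += 1
        self.experts[-1]["last_used"] = self.step
        if drift_detected:
            # freeze the current expert to preserve its concept, then
            # instantiate a new one for the emerging concept
            self.experts[-1]["frozen"] = True
            self._spawn()
        # prune frozen experts that have been idle beyond the threshold
        self.experts = [
            e for e in self.experts
            if not (e["frozen"] and self.step - e["last_used"] > self.prune_after)
        ]
```

Freezing rather than overwriting is what guards against catastrophic forgetting in this sketch, while pruning bounds the pool size, which is the capacity-adaptation trade-off the abstract describes.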
En Yu
Australian Artificial Intelligence Institute (AAII), University of Technology Sydney (UTS), Australia

Jie Lu
Australian Artificial Intelligence Institute (AAII), University of Technology Sydney (UTS), Australia

Kun Wang
Australian Artificial Intelligence Institute (AAII), University of Technology Sydney (UTS), Australia

Xiaoyu Yang
University of Cambridge
Speech recognition, machine learning

Guangquan Zhang
University of Technology Sydney, Australia
Fuzzy sets and systems, machine learning, decision support systems