🤖 AI Summary
This work addresses the challenge of effectively fusing heterogeneous language models, a task hindered by architectural disparities, misaligned parameter spaces, and conflicting knowledge. To overcome these obstacles, the authors propose HeteroFusion, a novel approach that replaces conventional parameter matching with topological alignment of functional modules and incorporates a conflict-aware denoising mechanism to suppress incompatible signals. HeteroFusion enables, for the first time, efficient fusion across distinct model families such as Llama, Qwen, and Mistral. By integrating adapter-based preservation with structured parameter update strategies, the method substantially improves fusion stability and cross-family generalization. Experimental results demonstrate that HeteroFusion consistently outperforms existing baselines in heterogeneous model fusion, multi-source ensemble tasks, and robustness to noisy inputs.
📝 Abstract
Model merging aims to integrate multiple expert models into a single model that inherits their complementary strengths without incurring the inference-time cost of ensembling. Recent progress has shown that merging can be highly effective when all source models are *homogeneous*, i.e., derived from the same pretrained backbone and therefore share aligned parameter coordinates or compatible task vectors. Yet this assumption is increasingly unrealistic in open model ecosystems, where useful experts are often built on different families such as Llama, Qwen, and Mistral. In such *heterogeneous* settings, direct weight-space fusion becomes ill-posed due to architectural mismatch, latent basis misalignment, and amplified cross-source conflict. We address this problem with `HeteroFusion` for heterogeneous language model fusion, which consists of two key components: topology-based alignment that transfers knowledge across heterogeneous backbones by matching functional module structures instead of raw tensor coordinates, and conflict-aware denoising that suppresses incompatible or noisy transfer signals during fusion. We further provide analytical justification showing that preserving the target adapter basis while predicting structured updates leads to a stable and well-conditioned transfer process. Across heterogeneous transfer, multi-source fusion, noisy-source robustness, and cross-family generalization settings, `HeteroFusion` consistently outperforms strong merging, fusion, and ensemble baselines.
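To make the conflict-aware denoising idea concrete, here is a minimal toy sketch (not the paper's actual algorithm): given per-parameter update vectors from several source models, it zeroes out coordinates where the sources disagree in sign, and averages the rest. The function name `conflict_aware_merge` and the agreement threshold `tau` are illustrative assumptions, and real methods would operate module-by-module after alignment rather than on a single flat vector.

```python
import numpy as np

def conflict_aware_merge(updates, tau=0.5):
    """Merge flat update vectors from several source models.

    updates: list of 1-D arrays of equal length, one per source model.
    tau: minimum fraction of sources that must agree with the majority
         sign for a coordinate to be kept (hypothetical hyperparameter).

    Coordinates whose sign agreement falls below tau are treated as
    cross-source conflicts and suppressed (set to zero); the remaining
    coordinates are averaged across sources.
    """
    U = np.stack(updates)                 # shape: (n_sources, n_params)
    signs = np.sign(U)
    majority = np.sign(signs.sum(axis=0))  # majority sign per coordinate
    agree = (signs == majority).mean(axis=0)
    mask = agree >= tau                    # keep only low-conflict coords
    return np.where(mask, U.mean(axis=0), 0.0)

# Example: the first coordinate agrees across sources and survives;
# the other two conflict in sign and are zeroed out.
merged = conflict_aware_merge(
    [np.array([1.0, -1.0, 2.0]), np.array([1.0, 1.0, -2.0])],
    tau=0.6,
)
print(merged)  # → [1. 0. 0.]
```

Sign-based conflict masking of this kind is one simple way to suppress incompatible transfer signals; the paper's mechanism may differ in how conflicts are detected and weighted.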