When MoE Meets Blockchain: A Trustworthy Distributed Framework of Large Models

📅 2025-09-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Mixture-of-Experts (MoE) models deployed over distributed edge networks lack trustworthy supervision, leaving data interactions among experts unverifiable and vulnerable to tampering attacks. To address this, the paper proposes a blockchain-empowered trustworthy distributed large language model framework. The framework adopts a three-tier collaborative architecture: (1) an edge layer for expert-parallel computation, (2) a blockchain layer leveraging consensus mechanisms and smart contracts to ensure data traceability, integrity, and immutability, and (3) a distributed storage layer enabling efficient and trustworthy data sharing. Blockchain is natively integrated into the MoE architecture, enabling end-to-end verifiable expert training and inference in decentralized environments. Experimental results demonstrate substantially improved robustness against data manipulation attacks compared with conventional distributed MoE approaches.

📝 Abstract
As an enabling architecture of Large Models (LMs), Mixture of Experts (MoE) has become prevalent thanks to its sparsely-gated mechanism, which lowers computational overhead while maintaining learning performance comparable to dense LMs. The essence of MoE lies in utilizing a group of neural networks (called experts), each specializing in different types of tasks, along with a trainable gating network that selectively activates a subset of these experts to handle specific tasks. Traditional cloud-based MoE encounters challenges such as prolonged response latency, high bandwidth consumption, and data privacy leakage. To address these issues, researchers have proposed deploying MoE over distributed edge networks. However, a key concern of distributed MoE frameworks is the lack of trust in data interactions among distributed experts without oversight by any trusted authority, which leaves them prone to attacks such as data manipulation. In response to the security issues of traditional distributed MoE, we propose a blockchain-aided trustworthy MoE (B-MoE) framework that consists of three layers: the edge layer, the blockchain layer, and the storage layer. In this framework, the edge layer employs the activated experts downloaded from the storage layer to process the learning tasks, while the blockchain layer functions as a decentralized trustworthy network to trace, verify, and record the computational results of the experts from the edge layer. The experimental results demonstrate that B-MoE is more robust to data manipulation attacks than traditional distributed MoE during both the training and inference processes.
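The sparsely-gated mechanism described above can be illustrated with a minimal sketch: a gating network scores every expert, only the top-k experts are activated, and their outputs are combined with renormalized gate weights. This is a generic top-k MoE illustration, not the paper's implementation; all shapes and names here are assumptions for the example.

```python
import numpy as np

def top_k_gate(x, gate_weights, k=2):
    """Score all experts, keep the top-k, and renormalize their weights."""
    logits = x @ gate_weights              # one gating score per expert
    top = np.argsort(logits)[-k:]          # indices of the k best experts
    w = np.exp(logits[top] - logits[top].max())  # softmax over the top-k only
    return top, w / w.sum()

rng = np.random.default_rng(0)
x = rng.normal(size=8)                     # input token representation
gate_weights = rng.normal(size=(8, 4))     # gating network for 4 experts
experts = [rng.normal(size=(8, 8)) for _ in range(4)]  # toy linear experts

idx, w = top_k_gate(x, gate_weights, k=2)
# Only the activated experts run; the output is their weighted sum.
y = sum(wi * (x @ experts[i]) for i, wi in zip(idx, w))
```

Because only k of the experts execute per input, compute scales with k rather than with the total number of experts, which is the overhead reduction the abstract refers to.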
Problem

Research questions and friction points this paper is trying to address.

Addressing trust issues in distributed MoE frameworks
Securing data interactions among experts without central authority
Preventing data manipulation attacks in training and inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Blockchain-aided trustworthy distributed MoE framework
Edge layer processes tasks with activated experts
Blockchain verifies and records expert computations
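The verification-and-recording idea in the bullets above can be sketched as an append-only hash chain: each record commits to the digest of the previous one, so altering any stored expert result invalidates every subsequent link. This is a minimal illustration of the tamper-evidence property, not the paper's consensus protocol; the `Ledger` class and its fields are hypothetical.

```python
import hashlib
import json

class Ledger:
    """Append-only hash chain over expert outputs: each block commits to
    the previous block's digest, so tampering breaks all later links."""

    def __init__(self):
        self.blocks = [{"prev": "0" * 64, "payload": "genesis"}]

    def _digest(self, block):
        data = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(data).hexdigest()

    def record(self, expert_id, result):
        block = {"prev": self._digest(self.blocks[-1]),
                 "payload": {"expert": expert_id, "result": result}}
        self.blocks.append(block)

    def verify(self):
        # Recompute each digest and check the chain of "prev" pointers.
        return all(b["prev"] == self._digest(p)
                   for p, b in zip(self.blocks, self.blocks[1:]))

ledger = Ledger()
ledger.record("expert-3", [0.12, 0.88])
ledger.record("expert-1", [0.40, 0.60])
assert ledger.verify()                                 # intact chain passes
ledger.blocks[1]["payload"]["result"] = [0.99, 0.01]   # simulated tampering
assert not ledger.verify()                             # tampering is detected
```

A real blockchain layer adds consensus and smart contracts on top of this chaining, but the tamper-evidence shown here is the core property that makes the recorded expert computations verifiable.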
Weihao Zhu
University of Illinois Urbana-Champaign
approximation algorithms · graph algorithms · hardness of approximation
Long Shi
School of Electronic and Optical Engineering, Nanjing University of Science and Technology, Nanjing 210094, China
Kang Wei
School of Cyber Science and Engineering, Southeast University, Nanjing, 211189, China; Engineering Research Center of Blockchain Application, Supervision and Management (Southeast University), Ministry of Education
Zhen Mei
Associate Professor, Nanjing University of Science and Technology
Information Theory · Deep Learning · In-Memory Computing · Error Correction Code
Zhe Wang
School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing, 210094, China
Jiaheng Wang
National Mobile Communications Research Laboratory, Southeast University, Nanjing 211102, China; Purple Mountain Laboratories, Nanjing 211111, China
Jun Li
National Mobile Communications Research Laboratory, Southeast University, China