Revealing the Attention Floating Mechanism in Masked Diffusion Models

📅 2026-01-12
📈 Citations: 0 · Influential: 0
🤖 AI Summary
This work uncovers the distinctive dynamics of attention mechanisms in masked diffusion models, revealing a fundamental divergence from autoregressive models. During denoising, shallow layers exhibit a phenomenon termed "attention floating," leveraging dispersed tokens to construct global structure, while deeper layers concentrate on semantic content, establishing a two-stage paradigm of "shallow structural awareness, deep content focus." Through attention visualization, ablation studies, and evaluation on knowledge-intensive tasks, this study provides the first systematic account of how this mechanism underpins strong in-context learning capabilities. Experiments demonstrate that masked diffusion models achieve up to twice the performance of autoregressive counterparts on such tasks, substantiating the critical role of the proposed attention floating mechanism.

📝 Abstract
Masked diffusion models (MDMs), which leverage bidirectional attention and a denoising process, are narrowing the performance gap with autoregressive models (ARMs). However, their internal attention mechanisms remain under-explored. This paper investigates the attention behaviors of MDMs, revealing the phenomenon of Attention Floating. Unlike ARMs, where attention converges to a fixed sink, MDMs exhibit dynamic, dispersed attention anchors that shift across denoising steps and layers. Further analysis reveals a Shallow Structure-Aware, Deep Content-Focused attention mechanism: shallow layers utilize floating tokens to build a global structural framework, while deeper layers allocate more capacity toward capturing semantic content. Empirically, this distinctive attention pattern provides a mechanistic explanation for the strong in-context learning capabilities of MDMs, allowing them to double the performance of ARMs on knowledge-intensive tasks. All code and datasets are available at https://github.com/NEUIR/Attention-Floating.
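The sink-versus-floating distinction the abstract draws can be quantified from attention weights alone: a fixed sink shows up as most attention mass landing on one token (low entropy per query), while floating anchors yield dispersed, higher-entropy distributions. The sketch below illustrates these two diagnostics on synthetic attention maps; it is not the paper's code, and the function names and the use of a Dirichlet to fake attention rows are illustrative assumptions.

```python
import numpy as np

def attention_entropy(attn):
    """Mean entropy of per-query attention distributions.

    attn: array of shape [num_heads, seq_len, seq_len], rows summing to 1.
    High values indicate dispersed ("floating") attention; low values
    indicate concentration on a few tokens (e.g. a sink).
    """
    eps = 1e-12  # avoid log(0)
    ent = -(attn * np.log(attn + eps)).sum(axis=-1)  # [heads, seq]
    return float(ent.mean())

def sink_mass(attn, sink_idx=0):
    """Average attention mass placed on one candidate sink token."""
    return float(attn[..., sink_idx].mean())

rng = np.random.default_rng(0)
heads, seq = 4, 16

# Dispersed pattern: near-uniform attention rows (stand-in for "floating").
floating = rng.dirichlet(np.ones(seq) * 5.0, size=(heads, seq))

# Sink pattern: most mass concentrated on token 0 (stand-in for an ARM sink).
alpha = np.ones(seq)
alpha[0] = 50.0
sink = rng.dirichlet(alpha, size=(heads, seq))

print(attention_entropy(floating), attention_entropy(sink))
print(sink_mass(floating), sink_mass(sink))
```

Tracking these two statistics per layer and per denoising step is one simple way to reproduce the kind of "shallow dispersed, deep focused" profile the paper describes.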
Problem

Research questions and friction points this paper is trying to address.

- Masked Diffusion Models
- Attention Mechanism
- Attention Floating
- In-Context Learning
- Autoregressive Models
Innovation

Methods, ideas, or system contributions that make the work stand out.

- Attention Floating
- Masked Diffusion Models
- In-Context Learning
- Denoising Process
- Bidirectional Attention