DiffER: Diffusion Entity-Relation Modeling for Reversal Curse in Diffusion Large Language Models

📅 2026-01-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the “reversal curse” in diffusion-based large language models (DLLMs)—a persistent unidirectional bias in modeling logically bidirectional entity relations. Through systematic analysis, we identify entity fragmentation, data asymmetry, and missing relational signals as primary causes. To mitigate this limitation, we propose DiffER, a novel approach that introduces entity-aware training and relation-enhanced data construction, thereby transcending the constraints of conventional autoregressive assumptions. Our method employs holistic entity masking and distribution-symmetric data augmentation to better align model learning with bidirectional relational semantics. Experimental results demonstrate that DiffER substantially alleviates the reversal curse, significantly improving DLLMs’ capacity for bidirectional relational reasoning across multiple benchmark tasks.

📝 Abstract
The "reversal curse" refers to the phenomenon where large language models (LLMs) exhibit predominantly unidirectional behavior when processing logically bidirectional relationships. Prior work attributed this to autoregressive training -- predicting the next token inherently favors left-to-right information flow over genuine bidirectional knowledge associations. However, we observe that Diffusion LLMs (DLLMs), despite being trained bidirectionally, also suffer from the reversal curse. To investigate the root causes, we conduct systematic experiments on DLLMs and identify three key reasons: 1) entity fragmentation during training, 2) data asymmetry, and 3) missing entity relations. Motivated by the analysis of these reasons, we propose Diffusion Entity-Relation Modeling (DiffER), which addresses the reversal curse through entity-aware training and balanced data construction. Specifically, DiffER introduces whole-entity masking, which mitigates entity fragmentation by predicting complete entities in a single step. DiffER further employs distribution-symmetric and relation-enhanced data construction strategies to alleviate data asymmetry and missing relations. Extensive experiments demonstrate that DiffER effectively alleviates the reversal curse in Diffusion LLMs, offering new perspectives for future research.
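The whole-entity masking idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, span format, and mask token are illustrative assumptions. The key contrast with standard diffusion-LM training is that tokens belonging to one entity are masked all-or-nothing, so the model must denoise the complete entity in a single step rather than seeing it fragmented:

```python
import random

def whole_entity_mask(tokens, entity_spans, mask_token="[MASK]", p=0.5):
    """Hypothetical sketch of whole-entity masking: mask every token of a
    chosen entity span jointly, so the model predicts the complete entity
    at once (independent per-token masking would fragment entities)."""
    masked = list(tokens)
    for start, end in entity_spans:      # spans are [start, end) token indices
        if random.random() < p:          # mask the whole span or none of it
            for i in range(start, end):
                masked[i] = mask_token
    return masked

tokens = ["Tom", "Cruise", "'s", "mother", "is", "Mary", "Lee", "Pfeiffer"]
spans = [(0, 2), (5, 8)]                 # "Tom Cruise", "Mary Lee Pfeiffer"
print(whole_entity_mask(tokens, spans, p=1.0))
```

With `p=1.0` both entity spans are masked in full while the relational context (`'s mother is`) stays visible, which is the training signal the abstract attributes to entity-aware training.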
Problem

Research questions and friction points this paper is trying to address.

reversal curse
Diffusion LLMs
entity fragmentation
data asymmetry
missing entity relations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Diffusion LLMs
reversal curse
entity-aware training
whole-entity masking
relation-enhanced data
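The distribution-symmetric data construction named above can likewise be sketched in a few lines. The templates and function below are illustrative assumptions, not the paper's actual pipeline; the point is that each (head, relation, tail) fact is emitted in both orderings, so neither direction dominates the training distribution:

```python
# Hypothetical templates; the paper's actual surface forms are not specified here.
FORWARD = "{head} is the {relation} of {tail}."
BACKWARD = "{tail}'s {relation} is {head}."

def symmetrize(facts):
    """Given (head, relation, tail) triples, produce both orderings so the
    training data states each relation in the forward and reversed direction."""
    examples = []
    for head, relation, tail in facts:
        examples.append(FORWARD.format(head=head, relation=relation, tail=tail))
        examples.append(BACKWARD.format(head=head, relation=relation, tail=tail))
    return examples

facts = [("Mary Lee Pfeiffer", "mother", "Tom Cruise")]
for line in symmetrize(facts):
    print(line)
```

A model trained only on the forward form tends to fail the reversed query; balancing both orderings is the data-side remedy the abstract describes.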
Shaokai He
Chongqing University
Kaiwen Wei
Chongqing University
Xinyi Zeng
Sichuan University
Medical Image Segmentation · Medical Image Reconstruction · Multi-modal Learning
Xiang Chen
Nanjing University of Science and Technology
Computer Vision · Image Processing · Artificial Intelligence · Deep Learning
Xue Yang
Shanghai Jiao Tong University
Zhenyang Li
Chongqing University
Jiang Zhong
Chongqing University
Yu Tian
Tsinghua University