Avoidance Decoding for Diverse Multi-Branch Story Generation

📅 2025-09-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address insufficient creative diversity and repetitive, monotonous outputs in large language models (LLMs) for story generation, this paper proposes an adaptive decoding strategy. The method integrates concept-level and narrative-level similarity penalties, employs a two-stage logits correction mechanism to dynamically balance diversity and coherence, and leverages neuron activation analysis to demonstrate enhanced creative modeling. Evaluated on multiple story generation benchmarks, the approach achieves up to a 2.6× improvement in output diversity and reduces repetition rates by an average of 30%, significantly mitigating text degeneration. The core contribution lies in the integration of fine-grained semantic similarity modeling with neuron activation analysis, enabling controllable and interpretable multi-branch narrative generation.

📝 Abstract
Large Language Models (LLMs) often generate repetitive and monotonous outputs, especially in tasks like story generation, due to limited creative diversity when given the same input prompt. To address this challenge, we propose a novel decoding strategy, Avoidance Decoding, that modifies token logits by penalizing similarity to previously generated outputs, thereby encouraging more diverse multi-branch stories. This penalty adaptively balances two similarity measures: (1) Concept-level Similarity Penalty, which is prioritized in early stages to diversify initial story concepts, and (2) Narrative-level Similarity Penalty, which is increasingly emphasized later to ensure natural yet diverse plot development. Notably, our method achieves up to 2.6 times higher output diversity and reduces repetition by an average of 30% compared to strong baselines, while effectively mitigating text degeneration. Furthermore, we reveal that our method activates a broader range of neurons, demonstrating that it leverages the model's intrinsic creativity.
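The two-stage logit correction described above can be sketched in a few lines. The sketch below is a hypothetical rendering, not the paper's implementation: `concept_sim` and `narrative_sim` are assumed to be per-token similarity scores in [0, 1] against previously generated branches, and the linear stage weight `w` is an illustrative choice for the adaptive balance between the two penalties.

```python
import numpy as np

def avoidance_adjust_logits(logits, concept_sim, narrative_sim,
                            step, max_steps, alpha=2.0):
    """Hypothetical sketch of Avoidance Decoding's logit penalty.

    Early in generation (small step) the concept-level penalty dominates,
    diversifying initial story concepts; later (large step) the
    narrative-level penalty takes over, steering plot development away
    from previous branches while keeping it natural.
    """
    w = step / max_steps  # 0 -> concept stage, 1 -> narrative stage
    penalty = (1.0 - w) * concept_sim + w * narrative_sim
    # Subtracting the penalty lowers the probability of tokens that are
    # too similar to earlier outputs before sampling.
    return logits - alpha * penalty

# Illustrative use: token 0 resembles an earlier concept, token 1 an
# earlier narrative continuation.
logits = np.zeros(3)
concept_sim = np.array([1.0, 0.0, 0.0])
narrative_sim = np.array([0.0, 1.0, 0.0])

early = avoidance_adjust_logits(logits, concept_sim, narrative_sim, step=0, max_steps=10)
late = avoidance_adjust_logits(logits, concept_sim, narrative_sim, step=10, max_steps=10)
```

At `step=0` only the concept-similar token is penalized; at `step=max_steps` only the narrative-similar token is, matching the adaptive schedule the abstract describes.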
Problem

Research questions and friction points this paper is trying to address.

Addresses repetitive story generation in LLMs
Enhances diversity in multi-branch narratives
Reduces text repetition while maintaining natural plots
Innovation

Methods, ideas, or system contributions that make the work stand out.

Avoidance Decoding penalizes similarity to previous outputs
Adaptively balances concept-level and narrative-level similarity penalties
Activates broader neural range to leverage intrinsic creativity