Context-Enhanced Contrastive Search for Improved LLM Text Generation

📅 2025-04-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
Large language models (LLMs) struggle to simultaneously maintain coherence, diversity, and topic relevance in long-text generation, often suffering from repetition and semantic drift. To address this, we propose Context-Enhanced Contrastive Search (CECS), a novel decoding method. Its core contributions are: (1) a dynamic context importance weighting mechanism that quantifies the semantic contribution of historical segments in real time; (2) a multi-level contrastive search architecture that jointly enforces local coherence and global topic consistency through hierarchical constraints; and (3) an adaptive temperature control strategy that modulates sampling stochasticity based on contextual entropy. Evaluated on legal document drafting, customer service dialogue, and marketing copy generation, CECS significantly outperforms existing contrastive search variants across BLEU, ROUGE-L, and semantic similarity metrics. Generated texts exhibit markedly improved topic focus and logical coherence.
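The paper does not spell out its scoring rule here, but the standard contrastive search criterion (model probability minus a degeneration penalty based on similarity to the context) gives a rough picture of where a context-importance weighting could plug in. The sketch below is an illustrative assumption, not the paper's method: `contrastive_search_step` and the `ctx_weights` vector are hypothetical names, and uniform weights recover plain contrastive search.

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_search_step(probs, cand_hidden, ctx_hidden, ctx_weights, alpha=0.6):
    """Pick the next token among top-k candidates.

    probs:       model probabilities of the k candidate tokens
    cand_hidden: hidden state each candidate would produce
    ctx_hidden:  hidden states of previously generated tokens
    ctx_weights: per-context-token importance weights (hypothetical
                 CECS-style weighting; all-ones = vanilla contrastive search)
    """
    best, best_score = 0, float("-inf")
    for i, (p, h) in enumerate(zip(probs, cand_hidden)):
        # degeneration penalty: largest *weighted* similarity to any context state
        penalty = max(w * cosine(h, c) for w, c in zip(ctx_weights, ctx_hidden))
        # trade model confidence off against the penalty
        score = (1 - alpha) * p - alpha * penalty
        if score > best_score:
            best, best_score = i, score
    return best
```

Down-weighting a context token makes candidates that merely echo it less heavily penalized, which is one plausible way "dynamic context importance weighting" could steer the search toward topically central history.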

📝 Abstract
Recently, Large Language Models (LLMs) have demonstrated remarkable advancements in Natural Language Processing (NLP). However, generating high-quality text that balances coherence, diversity, and relevance remains challenging. Traditional decoding methods, such as beam search and top-k sampling, often struggle with repetitive or incoherent outputs, particularly in tasks that require long-form text generation. To address these limitations, the paper proposes Context-Enhanced Contrastive Search (CECS), a novel enhancement of the well-known Contrastive Search algorithm with contextual calibration. The proposed scheme introduces several novelties, including dynamic contextual importance weighting, multi-level Contrastive Search, and adaptive temperature control, to optimize the balance between fluency, creativity, and precision. The performance of CECS is evaluated using several standard metrics such as BLEU, ROUGE, and semantic similarity. Experimental results demonstrate significant improvements in both the coherence and relevance of texts generated by CECS, outperforming existing Contrastive Search techniques. The proposed algorithm has several potential real-world applications, including legal document drafting, customer service chatbots, and content marketing.
Problem

Research questions and friction points this paper is trying to address.

Improving coherence and relevance in LLM text generation
Addressing repetitive or incoherent outputs in long-form generation
Balancing fluency, creativity, and precision in generated texts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic contextual importance weighting for relevance
Multi-level Contrastive Search for coherence
Adaptive temperature control for balance
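The adaptive temperature control is described only as modulating sampling stochasticity from contextual entropy. One minimal reading, sketched below under stated assumptions: when the next-token distribution already has high entropy, cool the temperature to protect precision; when it is sharply peaked, warm it to encourage diversity. The function name, bounds, and interpolation direction are illustrative guesses, not the paper's exact rule.

```python
import math

def adaptive_temperature(probs, t_min=0.7, t_max=1.3):
    """Map the normalized entropy of a next-token distribution to a
    sampling temperature in [t_min, t_max].

    Assumption (not from the paper): a confident, low-entropy model gets
    the higher temperature to avoid repetition; a diffuse, high-entropy
    model gets the lower temperature to stay precise.
    """
    # Shannon entropy of the distribution (0 log 0 taken as 0)
    entropy = -sum(p * math.log(p) for p in probs if p > 0)
    # normalize by the maximum entropy of a distribution over len(probs) tokens
    max_entropy = math.log(len(probs))
    norm = entropy / max_entropy if max_entropy > 0 else 0.0
    # linear interpolation: norm=0 (peaked) -> t_max, norm=1 (uniform) -> t_min
    return t_max - (t_max - t_min) * norm
```

A uniform distribution thus yields `t_min` and a one-hot distribution yields `t_max`; the returned temperature would then divide the logits before the contrastive scoring step.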