DEO: Training-Free Direct Embedding Optimization for Negation-Aware Retrieval

📅 2026-03-10
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the failure of existing retrieval methods to handle queries containing negation or exclusion semantics. The authors propose a training-free embedding refinement approach that decomposes a query into positive and negative components and enforces a contrastive objective in the embedding space to model negation accurately. This enables negation-aware retrieval without fine-tuning or additional training data, substantially reducing deployment complexity and computational overhead. Experimental results show significant improvements: on the NegConstraint dataset, the method achieves a 0.0738 gain in nDCG@10 and a 0.1028 gain in MAP@100; in multimodal retrieval, it outperforms CLIP by 6% in Recall@5.

📝 Abstract
Recent advances in Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) have enabled diverse retrieval methods. However, existing retrieval methods often fail to accurately retrieve results for negation and exclusion queries. To address this limitation, prior approaches rely on embedding adaptation or fine-tuning, which introduce additional computational cost and deployment complexity. We propose Direct Embedding Optimization (DEO), a training-free method for negation-aware text and multimodal retrieval. DEO decomposes queries into positive and negative components and optimizes the query embedding with a contrastive objective. Without additional training data or model updates, DEO outperforms baselines on NegConstraint, with gains of +0.0738 nDCG@10 and +0.1028 MAP@100, while improving Recall@5 by +6% over OpenAI CLIP in multimodal retrieval. These results demonstrate the practicality of DEO for negation- and exclusion-aware retrieval in real-world settings.
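The core idea described in the abstract — decompose a negation query into positive and negative components, then optimize the query embedding against a contrastive objective with a frozen encoder — can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the `embed` function is a toy deterministic stand-in for any frozen text encoder, and the split of the query into positive/negative parts is assumed to be given.

```python
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy deterministic stand-in for a frozen text encoder (hypothetical)."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:4], "big")
    rng = np.random.default_rng(seed)
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def refine_query(query: str, positive: str, negative: str,
                 steps: int = 100, lr: float = 0.1) -> np.ndarray:
    """Training-free refinement of a query embedding.

    Ascends a simple contrastive objective, e . e_pos - e . e_neg,
    by projected gradient steps on the unit sphere; no model weights
    are updated, only the query embedding itself.
    """
    e = embed(query).copy()
    e_pos, e_neg = embed(positive), embed(negative)
    for _ in range(steps):
        grad = e_pos - e_neg               # gradient of the linear objective
        e = e + lr * grad                  # move toward pos, away from neg
        e = e / np.linalg.norm(e)          # project back to the unit sphere
    return e

# e.g. "animals that are not cats" -> positive "animals", negative "cats"
e = refine_query("animals that are not cats", "animals", "cats")
```

After refinement, the embedding scores the positive component higher than the negative one under cosine similarity, which is the behavior a negation-aware retriever needs; the actual DEO objective and query decomposition may differ in detail.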
Problem

Research questions and friction points this paper is trying to address.

negation-aware retrieval
exclusion queries
retrieval accuracy
query understanding
multimodal retrieval
Innovation

Methods, ideas, or system contributions that make the work stand out.

Direct Embedding Optimization
Negation-Aware Retrieval
Training-Free
Contrastive Objective
Multimodal Retrieval
Taegyeong Lee
NCLab, KAIST
Mobile Systems, Social Computing, IoT, Edge AI
Jiwon Park
UC Berkeley
programming languages, software engineering
Seunghyun Hwang
Department of Applied Data Science, Sungkyunkwan University
JooYoung Jang
Miri.DIH