LogiCoL: Logically-Informed Contrastive Learning for Set-based Dense Retrieval

📅 2025-05-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing dual-encoder dense retrievers often violate the implicit set-theoretic constraints (such as subset or disjointness) carried by logical queries with AND/OR/NOT operators, leading to inconsistent retrieval results. Method: We propose the first framework that encodes these set relations as soft constraints expressed via fuzzy-logic t-norms (e.g., soft subset, soft disjointness) and integrates them into the contrastive learning objective of a dual-encoder architecture. The approach builds on in-batch supervised contrastive learning, jointly optimizing the embedding space for both semantic relevance and logical structure in an end-to-end manner. Contribution/Results: Evaluated on entity retrieval, the method significantly improves Recall@k and logical-consistency metrics across diverse logical query types. It shows superior robustness and generalization under compositional logical complexity, without requiring architectural modifications or external logical reasoning modules.
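The summary's reference to fuzzy-logic t-norms can be made concrete with the product t-norm, one common choice for grading AND/OR/NOT over membership scores in [0, 1]. A minimal sketch, assuming the product t-norm and its dual t-conorm (the paper's specific choice of t-norm is not stated here; function names are illustrative):

```python
# Product t-norm semantics for fuzzy set membership.
# Scores are graded memberships in [0, 1], not hard set decisions.

def t_and(a, b):
    # Product t-norm: fuzzy intersection
    return a * b

def t_or(a, b):
    # Probabilistic sum: the t-conorm dual of the product t-norm
    return a + b - a * b

def t_not(a):
    # Standard fuzzy negation
    return 1.0 - a

# Example: graded membership of a document in "A AND NOT B",
# given its memberships m_a and m_b in the result sets of A and B.
m_a, m_b = 0.9, 0.2
score = t_and(m_a, t_not(m_b))  # high only if in A and not in B
```

With this choice, De Morgan's laws hold exactly, which is what makes compound queries like NOT(A AND B) well-behaved under the soft semantics.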

📝 Abstract
While significant progress has been made with dual- and bi-encoder dense retrievers, they often struggle on queries with logical connectives, a use case that is often overlooked yet important in downstream applications: the retrieved results frequently fail to respect the logical constraints implied by the query. To address this challenge, we introduce LogiCoL, a logically-informed contrastive learning objective for dense retrievers. LogiCoL builds upon in-batch supervised contrastive learning and trains dense retrievers to respect the subset and mutually-exclusive set relations between query results via two sets of soft constraints expressed as t-norms in the learning objective. We evaluate the effectiveness of LogiCoL on the task of entity retrieval, where the model is expected to retrieve the set of Wikipedia entities that satisfy the implicit logical constraints in the query. We show that models trained with LogiCoL improve both retrieval performance and logical consistency of the results. We provide detailed analysis and insights into why queries with logical connectives are challenging for dense retrievers and why LogiCoL is effective.
Problem

Research questions and friction points this paper is trying to address.

Dense retrievers struggle with queries containing logical connectives
Retrieved results often violate logical constraints in queries
Improving retrieval performance and logical consistency for entity retrieval
Innovation

Methods, ideas, or system contributions that make the work stand out.

Logically-informed contrastive learning for dense retrievers
Uses t-norm soft constraints for set relations
Improves retrieval performance and logical consistency
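The innovation points above can be sketched as a training objective: an in-batch supervised contrastive term plus soft penalties for subset and disjointness relations between query result sets. This is a minimal NumPy illustration of the general idea, not the paper's actual loss; the function name, the sigmoid membership scores, and the hinge/product penalty forms are assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def logicol_style_loss(q, d, pos_mask, subset_pairs, disjoint_pairs, lam=0.1):
    """Illustrative objective: in-batch contrastive loss plus soft
    set-relation penalties on graded relevance scores.

    q: (Q, dim) query embeddings; d: (D, dim) document embeddings
    pos_mask: (Q, D) boolean, True where a document is relevant to a query
    subset_pairs: list of (i, j) meaning results(q_i) should be a subset
                  of results(q_j), e.g. "A AND B" vs. "A"
    disjoint_pairs: list of (i, j) whose result sets should be disjoint,
                    e.g. "A AND B" vs. "A AND NOT B"
    """
    sims = q @ d.T
    # Supervised in-batch contrastive term, averaged over positives
    log_p = np.log(softmax(sims, axis=1) + 1e-12)
    contrastive = -(log_p * pos_mask).sum() / pos_mask.sum()

    # Graded "membership" of each document in each query's result set
    m = 1.0 / (1.0 + np.exp(-sims))
    # Soft subset: membership in the subset query should not exceed
    # membership in the superset query (hinge on the violation)
    sub_pen = sum(np.maximum(m[i] - m[j], 0.0).mean() for i, j in subset_pairs)
    # Soft disjointness via the product t-norm: joint membership stays small
    dis_pen = sum((m[i] * m[j]).mean() for i, j in disjoint_pairs)
    return contrastive + lam * (sub_pen + dis_pen)
```

Because the penalties act only on scores, this fits the summary's claim that no architectural changes or external reasoning modules are needed; the logical structure is absorbed into the embedding geometry during training.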