CPTuning: Contrastive Prompt Tuning for Generative Relation Extraction

📅 2025-01-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing generative relation extraction methods support only single-relation prediction, failing to model multi-relational overlap between entity pairs. To address this, we propose Contrastive Prompt Tuning (CPTuning), a novel framework that reformulates relation identification as a contrastive generation task with an explicit existence-threshold decision mechanism. CPTuning integrates vocabulary-constrained decoding and relation-template-guided generation to enable verifiable joint multi-relation extraction. Crucially, it is the first approach to explicitly model and learn a probabilistic relation existence threshold, thereby unifying treatment of both single- and multi-relation scenarios and departing from conventional deterministic relation assumptions. Evaluated on four benchmark datasets, T5-large enhanced with CPTuning achieves state-of-the-art performance in both single- and multi-relation extraction, significantly outperforming prior methods.
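The existence-threshold decision described above can be sketched in a few lines; the function name `select_relations`, the threshold value, and the example scores are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the threshold decision: a candidate relation is
# kept iff its estimated likelihood exceeds a threshold tau. Names and
# values here are illustrative, not from the paper.
def select_relations(candidates, tau=0.5):
    """candidates: list of (relation, estimated_probability) pairs.

    Returns every relation whose probability clears the threshold, so
    zero, one, or many relations may be extracted per entity pair --
    this is what unifies single- and multi-relation extraction.
    """
    return [rel for rel, p in candidates if p > tau]
```

For example, `select_relations([("employer", 0.92), ("spouse", 0.71), ("place of birth", 0.12)])` keeps the first two candidates and drops the third.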

📝 Abstract
Generative relation extraction (RE) commonly reformulates RE as a linguistic modeling problem easily tackled with pre-trained language models (PLMs) and then fine-tunes a PLM with a supervised cross-entropy loss. Although these approaches achieve promising performance, they assume only one deterministic relation between each pair of entities, without considering real scenarios where multiple relations may be valid, i.e., entity pair overlap, which limits their applicability. To address this problem, we introduce a novel contrastive prompt tuning method for RE, CPTuning, which learns to associate a candidate relation between two in-context entities with a probability mass above or below a threshold, corresponding to whether the relation exists. Beyond this learning schema, CPTuning also organizes RE as a verbalized relation generation task and uses Trie-constrained decoding to ensure the model generates valid relations. At inference, it adaptively selects the generated candidate relations with high estimated likelihood, thereby achieving multi-relation extraction. We conduct extensive experiments on four widely used datasets to validate our method. Results show that T5-large fine-tuned with CPTuning significantly outperforms previous methods in both single- and multi-relation extraction.
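The Trie-constrained decoding mentioned in the abstract restricts generation so that only valid verbalized relation labels can be produced. A minimal sketch of the idea, assuming a simple token-level prefix tree (the class and method names are illustrative; in practice `allowed_next_tokens` would be plugged into a decoder's prefix-constraint hook):

```python
# Hedged sketch: a prefix tree (Trie) over tokenized relation labels.
# Class and method names are illustrative, not from the paper.
class RelationTrie:
    """Constrains decoding so only valid relation strings are reachable."""

    def __init__(self, sequences):
        self.root = {}
        for seq in sequences:
            node = self.root
            for tok in seq:
                node = node.setdefault(tok, {})
            node[None] = {}  # None marks a complete relation (end-of-sequence)

    def allowed_next_tokens(self, prefix):
        """Tokens that may follow `prefix`; None stands for end-of-sequence.
        Returns [] if the prefix is not a valid start of any relation."""
        node = self.root
        for tok in prefix:
            if tok not in node:
                return []
            node = node[tok]
        return list(node)

# Toy vocabulary of verbalized relations, tokenized by whitespace.
trie = RelationTrie([
    ["place", "of", "birth"],
    ["place", "of", "death"],
    ["employer"],
])
```

Here `trie.allowed_next_tokens(["place", "of"])` permits only `"birth"` or `"death"`, so the decoder can never emit a string outside the relation vocabulary.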
Problem

Research questions and friction points this paper is trying to address.

Multi-label Relation Extraction
Generative Models
Complex Relationships
Innovation

Methods, ideas, or system contributions that make the work stand out.

CPTuning
Multi-Relation Extraction
Trie Decoding
Jiaxin Duan
Peking University
Machine Learning, Deep Learning, Natural Language Processing
Fengyu Lu
Peking University, Beijing, China
Junfei Liu
Peking University, Beijing, China