OASIS: Order-Augmented Strategy for Improved Code Search

📅 2025-03-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
Conventional contrastive learning paradigms for code search (e.g., InfoNCE) rely on binary positive/negative supervision and ignore fine-grained semantic distinctions among negative samples, limiting the discriminative power of code embeddings. Method: We propose an order-augmented contrastive learning framework that introduces ordered similarity supervision over negatives, explicitly modeling the semantic gradation within the negative set. Specifically, we design an order-augmented InfoNCE loss that captures ordinal semantic relationships between natural language queries and code snippets within a dual-encoder architecture. Contribution/Results: Our approach moves beyond coarse-grained positive/negative discrimination by exploiting the semantic hierarchy among negatives. Evaluated on multiple standard code search benchmarks, it consistently outperforms state-of-the-art methods, achieving up to an 8.2% absolute improvement in Recall@1. These results demonstrate both the effectiveness and generalizability of fine-grained, ranking-aware representation learning for code search.
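The summary above describes combining an InfoNCE objective with ordered similarity supervision among negatives. As a minimal sketch of that idea (not the paper's exact formulation), one way to realize it is standard InfoNCE plus a pairwise margin penalty that enforces the given ranking over negatives; the temperature `tau`, `margin`, and weight `lam` below are illustrative assumptions:

```python
import math

def info_nce(pos_sim, neg_sims, tau=0.05):
    # Standard InfoNCE: softmax cross-entropy with the positive pair's
    # similarity as the target logit and negatives as distractors.
    logits = [pos_sim / tau] + [s / tau for s in neg_sims]
    m = max(logits)  # subtract max for numerical stability
    log_z = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(pos_sim / tau - log_z)

def order_augmented_loss(pos_sim, ranked_neg_sims, tau=0.05, margin=0.1, lam=1.0):
    # ranked_neg_sims is ordered from most- to least-similar negative.
    # A hinge penalty fires whenever a lower-ranked candidate scores within
    # `margin` of (or above) a higher-ranked one, encoding the order labels.
    base = info_nce(pos_sim, ranked_neg_sims, tau)
    chain = [pos_sim] + ranked_neg_sims  # positive sits at rank 0
    rank_penalty = sum(max(0.0, margin + chain[i + 1] - chain[i])
                       for i in range(len(chain) - 1))
    return base + lam * rank_penalty
```

With a well-ordered similarity chain the penalty term vanishes and the loss reduces to plain InfoNCE; violating the order adds a hinge cost per inverted pair, which is what lets the model learn gradations among negatives rather than treating them as interchangeable.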

📝 Abstract
Code embeddings capture the semantic representations of code and are crucial for various code-related large language model (LLM) applications, such as code search. Previous training primarily relies on optimizing the InfoNCE loss by comparing positive natural language (NL)-code pairs with in-batch negatives. However, due to the sparse nature of code contexts, training solely by comparing the major differences between positive and negative pairs may fail to capture deeper semantic nuances. To address this issue, we propose a novel order-augmented strategy for improved code search (OASIS). It leverages order-based similarity labels to train models to capture subtle differences in similarity among negative pairs. Extensive benchmark evaluations demonstrate that our OASIS model significantly outperforms previous state-of-the-art models focusing solely on major positive-negative differences. It underscores the value of exploiting subtle differences among negative pairs with order labels for effective code embedding training.
Problem

Research questions and friction points this paper is trying to address.

Improves code search by capturing deeper semantic nuances
Addresses sparse code context in training code embeddings
Leverages order-based similarity labels for better model training
Innovation

Methods, ideas, or system contributions that make the work stand out.

Order-augmented strategy for code search
Leverages order-based similarity labels
Captures subtle similarity differences among negative pairs
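The order-based similarity labels mentioned above have to come from somewhere. As one hypothetical illustration (the paper's actual labeling procedure is not described on this page), negatives could be ranked by token overlap with the ground-truth code snippet, yielding an ordering for the loss to enforce:

```python
def order_negatives(pos_tokens, neg_token_lists):
    # Hypothetical proxy for semantic similarity: Jaccard token overlap
    # between each negative snippet and the positive (ground-truth) code.
    pos_set = set(pos_tokens)

    def overlap(tokens):
        cand = set(tokens)
        union = pos_set | cand
        return len(pos_set & cand) / max(len(union), 1)

    # Return negative indices sorted most- to least-similar.
    return sorted(range(len(neg_token_lists)),
                  key=lambda i: overlap(neg_token_lists[i]),
                  reverse=True)
```

For example, given a positive `add(a, b)` snippet, a near-duplicate `add(x, y)` negative would be ranked ahead of an unrelated `print("hello")` negative, and that ranking becomes the order label fed to the training objective.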
Authors
Zuchen Gao (PhD candidate, The Hong Kong Polytechnic University)
Zizheng Zhan (Kwai Inc.)
Xianming Li (PhD candidate, The Hong Kong Polytechnic University)
Erxin Yu (The Hong Kong Polytechnic University)
Haotian Zhang (Kwai Inc.)
Bin Chen (Kwai Inc.)
Yuqun Zhang (Southern University of Science and Technology)
Jing Li (The Hong Kong Polytechnic University)