Optimizing Knowledge Utilization for Multi-Intent Comment Generation with Large Language Models

📅 2025-10-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing code comment generation methods struggle to model the “intent–code–comment” ternary relationship under few-shot settings and fail to accommodate developers’ diverse intent requirements. To address this, we propose KUMIC, a multi-intent-oriented comment generation framework. Its core contributions are: (1) a code–comment consistency retrieval mechanism that selects highly relevant in-context examples; and (2) a chain-of-thought–based mapping knowledge chain that explicitly models the reasoning path from program intent and code structure to comment style. By integrating in-context learning with knowledge-chain guidance, KUMIC significantly enhances large language models’ multi-intent comprehension and generation capabilities in low-resource scenarios. Experiments demonstrate that KUMIC outperforms state-of-the-art methods by 14.49% (BLEU), 22.41% (METEOR), 20.72% (ROUGE-L), and 12.94% (SBERT), validating its effectiveness and generalizability.

📝 Abstract
Code comment generation aims to produce a generic overview of a code snippet, helping developers understand and maintain code. However, generic summaries alone are insufficient to meet the diverse needs of practitioners; for example, developers expect the implementation insights to be presented in an untangled manner, while users seek clear usage instructions. This highlights the necessity of multi-intent comment generation. With the widespread adoption of Large Language Models (LLMs) for code-related tasks, these models have been leveraged to tackle the challenge of multi-intent comment generation. Despite their successes, state-of-the-art LLM-based approaches often struggle to construct correct relationships among intents, code, and comments within a smaller number of demonstration examples. To mitigate this issue, we propose a framework named KUMIC for multi-intent comment generation. Built upon in-context learning, KUMIC leverages Chain-of-Thought (CoT) to optimize knowledge utilization for LLMs to generate intent-specific comments. Specifically, KUMIC first designs a retrieval mechanism to obtain similar demonstration examples, which exhibit high code-comment consistency. Then, KUMIC leverages CoT to guide LLMs to focus on statements facilitating the derivation of code comments aligned with specific intents. In this context, KUMIC constructs a mapping knowledge chain, linking code to intent-specific statements to comments, which enables LLMs to follow similar reasoning steps when generating the desired comments. We conduct extensive experiments to evaluate KUMIC, and the results demonstrate that KUMIC outperforms state-of-the-art baselines by 14.49%, 22.41%, 20.72%, and 12.94% in terms of BLEU, METEOR, ROUGE-L, and SBERT, respectively.
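The abstract describes a two-stage pipeline: retrieve demonstration examples with high code-comment consistency, then prompt the LLM with a mapping knowledge chain (code → intent-specific statements → comment). As an illustration only, the sketch below assembles such a chain-of-thought prompt; the template wording, field names, and `build_prompt` function are hypothetical stand-ins, not the paper's actual prompt format.

```python
def build_prompt(intent, demos, query_code):
    """Assemble a CoT-style prompt from mapping-knowledge-chain demonstrations.

    demos: list of (code, key_statements, comment) triples, where
    key_statements are the lines of the demo code relevant to the intent.
    """
    parts = [f"Task: write a {intent} comment for the final code snippet.\n"]
    for code, statements, comment in demos:
        # Each demonstration spells out the full reasoning path:
        # code -> intent-relevant statements -> comment.
        parts.append(
            "Code:\n" + code + "\n"
            "Statements relevant to the intent:\n" + "\n".join(statements) + "\n"
            "Comment:\n" + comment + "\n"
        )
    # The query ends at the "statements" step, so the model continues the
    # same reasoning chain before emitting the comment.
    parts.append("Code:\n" + query_code + "\nStatements relevant to the intent:")
    return "\n".join(parts)
```

Ending the prompt mid-chain (after the "statements" cue) is what nudges the model to reproduce the demonstrated reasoning steps rather than jump straight to a generic comment.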
Problem

Research questions and friction points this paper is trying to address.

Optimizing knowledge utilization for multi-intent comment generation
Addressing LLM struggles with intent-code-comment relationship construction
Generating intent-specific comments using Chain-of-Thought reasoning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Chain-of-Thought to optimize knowledge utilization
Retrieves similar examples with high code-comment consistency
Links code to intent-specific statements for comment generation
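The retrieval idea above can be sketched as follows: rank candidate (code, comment) pairs by similarity to the query code, keeping only pairs whose code and comment are mutually consistent. This is an illustrative sketch, assuming a simple token-overlap similarity in place of the paper's actual retrieval model; the function names and the consistency threshold are hypothetical.

```python
def token_overlap(a: str, b: str) -> float:
    """Jaccard similarity over whitespace tokens (stand-in for embeddings)."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def retrieve_demos(query_code, corpus, k=2, consistency_threshold=0.1):
    """Return up to k demos similar to query_code, filtered for consistency.

    corpus: list of (code, comment) pairs.
    """
    scored = []
    for code, comment in corpus:
        # Proxy for code-comment consistency: how much the comment actually
        # reflects the code it claims to describe.
        consistency = token_overlap(code, comment)
        if consistency >= consistency_threshold:
            scored.append((token_overlap(query_code, code), code, comment))
    scored.sort(key=lambda t: t[0], reverse=True)
    return [(code, comment) for _, code, comment in scored[:k]]
```

Filtering on consistency before ranking on similarity mirrors the stated design goal: a demonstration is only useful in-context if its comment faithfully describes its code.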
Shuochuan Li
School of Computer Software and School of New Media and Communication, Tianjin University, Tianjin, China
Zan Wang
School of Computer Software and School of New Media and Communication, Tianjin University, Tianjin, China
Xiaoning Du
Senior Lecturer (equivalent to U.S. Associate Professor), Monash University
Software Engineering · Artificial Intelligence · Cybersecurity · Runtime Verification
Zhuo Wu
School of Computer Software and School of New Media and Communication, Tianjin University, Tianjin, China
Jiuqiao Yu
UC Berkeley Engineering, University of California, Berkeley
Junjie Chen
School of Computer Software, Tianjin University, Tianjin, China