From Prompts to Propositions: A Logic-Based Lens on Student-LLM Interactions

📅 2025-04-25
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses students’ practical use of large language models (LLMs) for solving programming problems in computing education, focusing on the evolution of their prompting behaviors and the identification of persistent difficulties. Method: We propose Prompt2Constraints—a novel method that automatically translates natural-language prompts into propositional logic constraints, enabling semantically grounded, quantifiable, and logically tractable modeling. Contribution/Results: Empirical analysis of 1,872 prompts from 203 students reveals a previously unobserved “strategic prompt transition” phenomenon in failed attempts and pinpoints critical intervention nodes. Our framework effectively distinguishes struggling students—whose prompt modifications exhibit significantly greater magnitude—and accurately identifies high-value, real-time intervention opportunities. These findings provide both theoretical foundations and technical pathways for LLM-augmented intelligent programming instruction.

📝 Abstract
Background and Context. The increasing integration of large language models (LLMs) in computing education presents an emerging challenge in understanding how students use LLMs and craft prompts to solve computational tasks. Prior research has used both qualitative and quantitative methods to analyze prompting behavior, but these approaches lack scalability or fail to effectively capture the semantic evolution of prompts. Objective. In this paper, we investigate whether students' prompts can be systematically analyzed using propositional logic constraints. We examine whether this approach can identify patterns in prompt evolution, detect struggling students, and provide insights into effective and ineffective strategies. Method. We introduce Prompt2Constraints, a novel method that translates students' prompts into logical constraints. The constraints represent the intent of the prompts in a succinct and quantifiable way. We used this approach to analyze a dataset of 1,872 prompts from 203 students solving introductory programming tasks. Findings. We find that while successful and unsuccessful attempts tend to use a similar number of constraints overall, when students fail, they often modify their prompts more significantly, shifting problem-solving strategies midway. We also identify points where specific interventions could be most helpful to students for refining their prompts. Implications. This work offers a new and scalable way to detect students who struggle in solving natural language programming tasks. It could be extended to more complex tasks and integrated into programming tools to provide real-time support.
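The abstract's core idea, representing each prompt version as a set of propositional constraints and quantifying how much a student changes them between attempts, can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the constraint names, the Jaccard distance measure, and the shift threshold are all assumptions introduced here for demonstration.

```python
def constraint_distance(prev: frozenset, curr: frozenset) -> float:
    """Jaccard distance between two constraint sets (0 = identical, 1 = disjoint)."""
    if not prev and not curr:
        return 0.0
    return 1.0 - len(prev & curr) / len(prev | curr)


def flag_strategy_shifts(prompt_constraints, threshold=0.5):
    """Flag consecutive prompt versions whose constraint sets diverge sharply,
    a rough proxy for the 'strategic prompt transition' the paper describes."""
    shifts = []
    for i in range(1, len(prompt_constraints)):
        d = constraint_distance(prompt_constraints[i - 1], prompt_constraints[i])
        if d >= threshold:
            shifts.append((i, d))
    return shifts


# Hypothetical example: three prompt versions already translated to constraints.
history = [
    frozenset({"sorted(output)", "input_is_list"}),
    frozenset({"sorted(output)", "input_is_list", "handles_empty"}),   # refinement
    frozenset({"uses_recursion", "returns_string"}),                   # strategy change
]
print(flag_strategy_shifts(history))  # → [(2, 1.0)]
```

Under this sketch, small refinements (adding one constraint) fall below the threshold, while a wholesale rewrite of the constraint set is flagged as a candidate intervention point.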
Problem

Research questions and friction points this paper is trying to address.

Analyzing student-LLM interactions using propositional logic constraints
Identifying patterns in prompt evolution and struggling students
Providing scalable insights for effective prompt refinement strategies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Translates prompts into logical constraints
Analyzes prompt evolution patterns
Detects struggling students effectively