Online Learning of HTN Methods for Integrated LLM-HTN Planning

📅 2025-11-16
📈 Citations: 0
Influential: 0
🤖 AI Summary
LLM-driven Hierarchical Task Network (HTN) planning systems invoke large language models (LLMs) such as ChatGPT frequently, incurring high computational cost and low efficiency. Method: This paper proposes an online inductive HTN method learning mechanism within the ChatHTN framework. It automatically extracts reusable, hierarchical task decomposition rules from LLM-generated planning traces and dynamically updates the HTN method library in a memoization-like fashion, enabling cross-task knowledge transfer. Crucially, the LLM serves as a "temporary teacher," with lightweight inductive learning replacing repeated LLM invocations. Contribution/Results: Evaluated across two domains, the approach reduces average ChatGPT calls by 62.3%, improves task-solving success rate by 4.7%, and enhances both generalization capability and planning efficiency, significantly lowering reliance on costly LLM inference while preserving solution quality.

📝 Abstract
We present online learning of Hierarchical Task Network (HTN) methods in the context of integrated HTN planning and LLM-based chatbots. Methods indicate when and how to decompose tasks into subtasks. Our method learner is built on top of the ChatHTN planner. ChatHTN queries ChatGPT to generate a decomposition of a task into primitive tasks when no applicable method for the task is available. In this work, we extend ChatHTN. Namely, when ChatGPT generates a task decomposition, ChatHTN learns from it, akin to memoization. However, unlike memoization, it learns a generalized method that applies not only to the specific instance encountered, but to other instances of the same task. We conduct experiments on two domains and demonstrate that our online learning procedure reduces the number of calls to ChatGPT while solving at least as many problems, and in some cases, even more.
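The abstract's key distinction from plain memoization is that the learned method is variablized, so it covers every instance of the task rather than only the one ChatGPT solved. A minimal sketch of that lifting step, with illustrative names and data shapes (not ChatHTN's actual API), might look like:

```python
# Hypothetical sketch of the generalization step described in the abstract:
# a concrete decomposition returned by the LLM is lifted into a reusable HTN
# method by consistently replacing constants with variables. Names and tuple
# encodings are assumptions for illustration, not ChatHTN's real interface.

def generalize(task, subtasks):
    """Lift a grounded task decomposition into a variablized HTN method.

    task:     e.g. ("deliver", "pkg1", "locA") -- head with constant args
    subtasks: list of grounded primitive tasks produced by the LLM
    """
    var_map = {}                      # constant -> variable name

    def lift(term):
        name, *args = term
        lifted = []
        for a in args:
            if a not in var_map:
                var_map[a] = f"?v{len(var_map)}"
            lifted.append(var_map[a])
        return (name, *lifted)

    head = lift(task)
    body = [lift(st) for st in subtasks]
    return {"head": head, "subtasks": body}

method = generalize(
    ("deliver", "pkg1", "locA"),
    [("pickup", "pkg1"), ("move", "locA"), ("drop", "pkg1")],
)
# method["head"] is now ("deliver", "?v0", "?v1"): unlike memoization,
# it matches any future instance of the "deliver" task.
```

Because the same constant is mapped to the same variable everywhere it appears, the learned method preserves the co-reference structure of the original decomposition (here, the package picked up is the package dropped).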
Problem

Research questions and friction points this paper is trying to address.

How can generalized HTN methods be learned online from ChatGPT's task decompositions?
How can dependency on costly ChatGPT calls be reduced while maintaining problem-solving performance?
How can ChatHTN's instance-specific LLM solutions be converted into reusable methods?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Online learning of generalized HTN methods
Reduces ChatGPT calls via method generalization
Integrates LLM-based decomposition with HTN planning
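The three bullets above describe one control loop: try a stored method first, fall back to the LLM only when none applies, and learn a generalized method from the LLM's answer so later instances of the same task skip the call. A self-contained sketch under assumed names (`fake_llm` stands in for ChatGPT; none of these functions are ChatHTN's real interface):

```python
# Hypothetical sketch of the integrated LLM-HTN loop: reuse a learned method
# when one matches, otherwise make a single LLM call and learn from it.

def match(head, task):
    """Bind a variablized method head against a grounded task, or None."""
    if head[0] != task[0] or len(head) != len(task):
        return None
    return dict(zip(head[1:], task[1:]))

def instantiate(subtasks, bindings):
    return [(s[0], *(bindings.get(a, a) for a in s[1:])) for s in subtasks]

def generalize(task, subtasks):
    """Replace task constants with variables so the method covers all instances."""
    vars_ = {c: f"?v{i}" for i, c in enumerate(dict.fromkeys(task[1:]))}
    lift = lambda t: (t[0], *(vars_.get(a, a) for a in t[1:]))
    return {"head": lift(task), "subtasks": [lift(s) for s in subtasks]}

def decompose(task, library, llm, calls):
    for m in library:
        b = match(m["head"], task)
        if b is not None:
            return instantiate(m["subtasks"], b)   # reuse: no LLM call
    calls.append(task)                             # costly ChatGPT query
    subtasks = llm(task)
    library.append(generalize(task, subtasks))     # online learning step
    return subtasks

def fake_llm(task):
    _, pkg, loc = task
    return [("pickup", pkg), ("move", loc), ("drop", pkg)]

library, calls = [], []
decompose(("deliver", "p1", "a"), library, fake_llm, calls)
plan = decompose(("deliver", "p2", "b"), library, fake_llm, calls)
# The second, distinct instance reuses the learned method,
# so only one LLM call is made in total.
```

This is the memoization-like behavior the summary describes, except that the cache key is a lifted method head rather than a concrete task instance, which is what enables cross-task transfer.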