ToolWeaver: Weaving Collaborative Semantics for Scalable Tool Use in Large Language Models

📅 2026-01-29
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations of existing retrieval-based and generative approaches to tool calling: the former struggle to model complex semantics, while the latter assign a unique token to each tool, leading to vocabulary explosion, poor generalization, and difficulty in learning collaborative semantics. To overcome these challenges, the authors propose ToolWeaver, a framework that encodes tools as hierarchical code sequences, so that vocabulary size grows logarithmically with the number of tools. Because related tools share substructures, their codes co-occur densely, which lets ToolWeaver learn collaborative semantics among tools. Combined with generative alignment fine-tuning, this approach integrates the structured tool representations into large language models. Experiments on nearly 47,000 tools show that ToolWeaver significantly outperforms state-of-the-art methods in scalability, generalization, and collaborative semantic understanding.
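To make the logarithmic-vocabulary claim concrete, here is a minimal sketch of hierarchical tool encoding. The code assignment below (base-64 digits of an integer ID, wrapped in per-level tokens) is purely illustrative; ToolWeaver learns its codes from tool semantics and co-usage, which is not reproduced here.

```python
import math

def hierarchical_code(tool_id: int, base: int, levels: int) -> list[str]:
    """Encode an integer tool ID as `levels` codes, each drawn from a
    per-level codebook of size `base` (an illustrative stand-in for
    ToolWeaver's learned hierarchical codes)."""
    codes = []
    for level in range(levels):
        tool_id, digit = divmod(tool_id, base)
        codes.append(f"<c{level}_{digit}>")
    return codes

# For ~47,000 tools, 3 levels of 64 codes suffice (64**3 = 262,144 slots):
# the vocabulary grows by only 3 * 64 = 192 new tokens instead of 47,000.
num_tools, base = 47_000, 64
levels = math.ceil(math.log(num_tools, base))
print(levels, base * levels)  # → 3 192
print(hierarchical_code(46_999, base, levels))
```

The key property is that adding more tools deepens (or widens) the code hierarchy slowly, while a one-token-per-tool scheme grows the vocabulary linearly.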

📝 Abstract
Prevalent retrieval-based tool-use pipelines struggle with a dual semantic challenge: their retrievers often employ encoders that fail to capture complex semantics, while the Large Language Model (LLM) itself lacks intrinsic tool knowledge from its natural language pretraining. Generative methods offer a powerful alternative by unifying selection and execution, tasking the LLM to directly learn and generate tool identifiers. However, the common practice of mapping each tool to a unique new token introduces substantial limitations: it creates a scalability and generalization crisis, as the vocabulary size explodes and each tool is assigned a semantically isolated token. This approach also creates a semantic bottleneck that hinders the learning of collaborative tool relationships, as the model must infer them from sparse co-occurrences of monolithic tool IDs within a vast library. To address these limitations, we propose ToolWeaver, a novel generative tool learning framework that encodes tools into hierarchical sequences. This approach makes vocabulary expansion logarithmic in the number of tools. Crucially, it enables the model to learn collaborative patterns from the dense co-occurrence of shared codes, rather than the sparse co-occurrence of monolithic tool IDs. We generate these structured codes through a novel tokenization process designed to weave together a tool's intrinsic semantics with its extrinsic co-usage patterns. These structured codes are then integrated into the LLM through a generative alignment stage, where the model is fine-tuned to produce the hierarchical code sequences. Evaluation results with nearly 47,000 tools show that ToolWeaver significantly outperforms state-of-the-art methods, establishing a more scalable, generalizable, and semantically aware foundation for advanced tool-augmented agents.
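The abstract's contrast between sparse ID co-occurrence and dense code co-occurrence can be sketched with a toy count. The tool names, code assignments, and trajectories below are invented for illustration only; the paper's learned tokenization is not reproduced here.

```python
from collections import Counter
from itertools import combinations

# Hypothetical trajectories: the tools used to solve each task.
trajectories = [
    ["get_weather", "plot_chart"],
    ["get_stock", "plot_chart"],
    ["get_weather", "send_email"],
]

# Assumed code assignment: tools in the same functional family share a code.
codes = {
    "get_weather": ["<fetch>", "<weather>"],
    "get_stock":   ["<fetch>", "<finance>"],
    "plot_chart":  ["<render>", "<chart>"],
    "send_email":  ["<io>", "<mail>"],
}

id_pairs, code_pairs = Counter(), Counter()
for traj in trajectories:
    for a, b in combinations(traj, 2):
        id_pairs[(a, b)] += 1                 # monolithic-ID co-occurrence
        for ca in codes[a]:
            for cb in codes[b]:
                code_pairs[(ca, cb)] += 1     # shared-code co-occurrence

# Every monolithic ID pair appears exactly once (sparse signal), while the
# ("<fetch>", "<render>") code pair accumulates evidence across trajectories.
print(max(id_pairs.values()))                 # → 1
print(code_pairs[("<fetch>", "<render>")])    # → 2
```

With thousands of tools, most ID pairs are never observed at all, whereas a small shared codebook keeps accumulating co-occurrence counts, which is the training signal the abstract calls "dense co-occurrence of shared codes".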
Problem

Research questions and friction points this paper is trying to address.

tool use
semantic representation
scalability
large language models
collaborative semantics
Innovation

Methods, ideas, or system contributions that make the work stand out.

hierarchical tool encoding
generative tool learning
structured code tokenization
collaborative semantics
scalable tool use
Bowen Fang
New Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA); School of Artificial Intelligence, University of Chinese Academy of Sciences; Zhongguancun Academy
Wen Ye
New Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA); School of Artificial Intelligence, University of Chinese Academy of Sciences
Yunyue Su
Institute of Automation, Chinese Academy of Sciences
Tool Agent, Multimodal LLMs, AI for Science, Information Extraction, Trustworthy AI
Jinghao Zhang
Kuaishou Tech
Recommender Systems, Multimedia, Large Language Model
Qiang Liu
Institute of Automation, Chinese Academy of Sciences
Data Mining, Multimodal LLMs, AI for Science
Yesheng Liu
New Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA); School of Artificial Intelligence, University of Chinese Academy of Sciences
Jiabing Yang
New Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA); School of Artificial Intelligence, University of Chinese Academy of Sciences
Xin Sun
New Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA)
Shu Wu
New Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA)
Baole Wei
Zhongguancun Academy; Zhongguancun Institute of Artificial Intelligence
Liang Wang
National Lab of Pattern Recognition
Computer Vision, Pattern Recognition, Machine Learning