Graph Neural Architecture Search with GPT-4

📅 2023-09-30
🏛️ arXiv.org
📈 Citations: 32
✨ Influential: 2
🤖 AI Summary
Existing graph neural architecture search (GNAS) methods rely heavily on manually designed search spaces and domain-specific optimization strategies, imposing high barriers to entry. To address this, we propose the first GPT-4–driven semantic GNAS framework. Our approach abandons hand-crafted discrete search spaces and explicit optimization policies; instead, it employs customized prompt engineering to guide a large language model (LLM) to iteratively generate executable graph neural network architectures, augmented by lightweight validation feedback for end-to-end automated design. Crucially, we recast architecture search as a natural language reasoning task, drastically reducing dependence on domain expertise. Evaluated on multiple standard graph benchmarks, the automatically generated architectures achieve superior accuracy compared to state-of-the-art GNAS methods, while also demonstrating significantly faster training convergence. These results validate the effectiveness and scalability of our semantic, knowledge-light GNAS paradigm.
πŸ“ Abstract
Graph Neural Architecture Search (GNAS) has shown promising results in automatically designing graph neural networks. However, GNAS still requires intensive human labor with rich domain knowledge to design the search space and search strategy. In this paper, we integrate GPT-4 into GNAS and propose a new GPT-4 based Graph Neural Architecture Search method (GPT4GNAS for short). The basic idea of our method is to design a new class of prompts for GPT-4 to guide GPT-4 toward the generative task of graph neural architectures. The prompts consist of descriptions of the search space, search strategy, and search feedback of GNAS. By iteratively running GPT-4 with the prompts, GPT4GNAS generates more accurate graph neural networks with fast convergence. Experimental results show that embedding GPT-4 into GNAS outperforms the state-of-the-art GNAS methods.
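The abstract describes an iterative loop: a prompt bundles the search space, search strategy, and accumulated search feedback; GPT-4 proposes an architecture; its validation result is fed back into the next prompt. A minimal sketch of that loop, with stub stand-ins for the LLM call and the GNN evaluation (all names such as `query_llm`, `evaluate`, and the toy search space are hypothetical, not the paper's actual implementation):

```python
# Sketch of a GPT4GNAS-style search loop. The LLM call and the
# validation step are stubbed out; a real system would call GPT-4
# and train each proposed GNN on a graph benchmark.

SEARCH_SPACE = "operations: {gcn, gat, sage}; layers: 2-4; aggregator: {mean, max}"
STRATEGY = "Propose one new architecture per round; exploit high-accuracy feedback."

def build_prompt(feedback):
    """Compose the three prompt parts the paper describes:
    search space, search strategy, and search feedback."""
    lines = [f"Search space: {SEARCH_SPACE}", f"Strategy: {STRATEGY}"]
    for arch, acc in feedback:
        lines.append(f"Feedback: {arch} -> val accuracy {acc:.3f}")
    lines.append("Return the next architecture to try.")
    return "\n".join(lines)

def query_llm(prompt):
    """Stub for a GPT-4 call; here it simply cycles through fixed
    candidates not yet mentioned in the prompt."""
    candidates = ["gcn-2-mean", "gat-3-max", "sage-4-mean"]
    for c in candidates:
        if c not in prompt:
            return c
    return candidates[-1]

def evaluate(arch):
    """Stub validation accuracy; a real system trains the architecture."""
    scores = {"gcn-2-mean": 0.78, "gat-3-max": 0.83, "sage-4-mean": 0.81}
    return scores.get(arch, 0.5)

def search(rounds=3):
    """Iteratively prompt the LLM, evaluate, and feed results back."""
    feedback = []
    for _ in range(rounds):
        arch = query_llm(build_prompt(feedback))
        feedback.append((arch, evaluate(arch)))
    return max(feedback, key=lambda t: t[1])

best_arch, best_acc = search()
```

The feedback lines are the only optimization signal: instead of an explicit search policy, the loop relies on the LLM reading past (architecture, accuracy) pairs and reasoning in natural language about what to try next.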
Problem

Research questions and friction points this paper is trying to address.

Automating graph neural architecture search to reduce human effort
Enhancing GNAS with Large Language Models for better performance
Improving search space and strategy understanding through specialized prompts
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates Large Language Models into Graph Neural Architecture Search
Uses GNAS prompts to guide LLM understanding of architecture generation
Iteratively runs LLMs with prompts for fast convergence
Haishuai Wang
Harvard University
Data Mining, Machine Learning
Yang Gao
College of Computer Science and Technology, Zhejiang University, China
Xin-Min Zheng
College of Computer Science and Technology, Zhejiang University, China
Peng Zhang
Cyberspace Institute of Advanced Technology, Guangzhou University, China
Hongyang Chen
Sun Yat-sen University
SDN, Cloud Computing, Microservice, AIOps
Jiajun Bu
College of Computer Science and Technology, Zhejiang University, China