Large Language Models Are Innate Crystal Structure Generators

📅 2025-02-28
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Accelerating materials discovery requires computationally efficient, broadly applicable methods for generating stable crystalline structures without costly iterative fine-tuning of domain-specific models. Method: We propose MatLLMSearch, a framework that encodes crystal structures as natural-language sequences and combines evolutionary search with zero-shot generation from off-the-shelf pre-trained large language models (LLMs), augmented by multi-tier validation using machine-learned interatomic potentials (MLIPs) and density functional theory (DFT). Contribution/Results: We demonstrate, for the first time, that pre-trained LLMs intrinsically possess crystal-structure modeling capability, bypassing the need for domain-specific fine-tuning and enabling task-transferable, plug-and-play structure generation. Experiments show that 78.38% of generated structures are metastable and 31.7% are confirmed DFT-stable, outperforming CrystalTextLLM. The framework supports both crystal structure prediction and multi-objective property optimization.
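The pipeline described above can be sketched as a small evolutionary loop: an LLM proposes offspring from text-encoded parent structures, a cheap stability filter prunes them, and the fittest candidates survive. This is a minimal illustrative sketch, not the paper's implementation: `llm_propose` and `mlip_energy` are hypothetical stand-ins for a real zero-shot LLM call on serialized (e.g. CIF-like) structures and a real MLIP energy-above-hull estimate.

```python
import random

random.seed(0)  # make the toy run reproducible

def llm_propose(parents):
    """Stand-in for a zero-shot LLM call that recombines parent structures
    encoded as token sequences. Here: a random single-point crossover."""
    a, b = random.sample(parents, 2)
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def mlip_energy(structure):
    """Stand-in for an MLIP stability score (lower is better).
    A real system would compute energy above hull with an MLIP."""
    return sum(sum(ord(ch) for ch in tok) for tok in structure) % 997 / 997

def evolve(population, generations=5, offspring_per_gen=8):
    """LLM-driven evolutionary search with a cheap validation tier."""
    for _ in range(generations):
        children = [llm_propose(population) for _ in range(offspring_per_gen)]
        # Multi-tier validation sketch: discard children scoring worse than
        # the current worst member; survivors would go on to DFT checks.
        cutoff = max(mlip_energy(s) for s in population)
        survivors = [c for c in children if mlip_energy(c) <= cutoff]
        # Keep the best individuals, population size fixed.
        population = sorted(population + survivors, key=mlip_energy)[:len(population)]
    return population

# Toy "structures": token sequences standing in for serialized crystals.
seed_pool = [[f"atom{i}{j}" for j in range(6)] for i in range(4)]
best = evolve(seed_pool)
print(len(best), round(mlip_energy(best[0]), 3))
```

Because the LLM is used zero-shot as the variation operator, swapping in a different design task only changes the prompt and the fitness function, which is the plug-and-play property the summary claims.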

📝 Abstract
Crystal structure generation is fundamental to materials discovery, enabling the prediction of novel materials with desired properties. While existing approaches leverage Large Language Models (LLMs) through extensive fine-tuning on materials databases, we show that pre-trained LLMs can inherently generate stable crystal structures without additional training. Our novel framework MatLLMSearch integrates pre-trained LLMs with evolutionary search algorithms, achieving a 78.38% metastable rate validated by machine-learned interatomic potentials and 31.7% DFT-verified stability via quantum mechanical calculations, outperforming specialized models such as CrystalTextLLM. Beyond crystal structure generation, we further demonstrate that our framework can be readily adapted to diverse materials design tasks, including crystal structure prediction and multi-objective optimization of properties such as deformation energy and bulk modulus, all without fine-tuning. These results establish pre-trained LLMs as versatile and effective tools for materials discovery, opening up new avenues for crystal structure generation with reduced computational overhead and broader accessibility.
Problem

Research questions and friction points this paper is trying to address.

Can pre-trained LLMs generate stable crystal structures without additional training?
How can LLMs be integrated with evolutionary search for materials discovery?
Can one framework adapt to diverse materials design tasks without fine-tuning?
Innovation

Methods, ideas, or system contributions that make the work stand out.

Pre-trained LLMs generate stable crystal structures zero-shot, with no fine-tuning.
MatLLMSearch combines LLM generation with evolutionary search and MLIP/DFT validation.
The framework adapts to structure prediction and multi-objective property optimization.