Training-Free Active Learning Framework in Materials Science with Large Language Models

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional machine learning approaches to active learning in materials science suffer from cold-start limitations and heavy dependence on domain-specific feature engineering, resulting in poor generalizability. To address these challenges, this work proposes a training-free, prompt-driven active learning paradigm based on large language models (LLMs) that leverages raw material text descriptions and numerical properties directly for experiment recommendation, eliminating reliance on initial labeled data and handcrafted features. Two complementary prompting strategies are designed: concise numerical prompts and extended textual prompts, enabling cross-dataset applicability, robustness, and computational efficiency in candidate screening. Evaluated on four diverse materials datasets, the method converges to high-performance regions using fewer than 30% of total experiments, substantially outperforming conventional active learning baselines. Results demonstrate superior efficiency, stability, and reproducibility, establishing a scalable, feature-agnostic framework for accelerated materials discovery.

📝 Abstract
Active learning (AL) accelerates scientific discovery by prioritizing the most informative experiments, but traditional machine learning (ML) models used in AL suffer from cold-start limitations and domain-specific feature engineering, restricting their generalizability. Large language models (LLMs) offer a new paradigm by leveraging their pretrained knowledge and universal token-based representations to propose experiments directly from text-based descriptions. Here, we introduce an LLM-based active learning framework (LLM-AL) that operates in an iterative few-shot setting and benchmark it against conventional ML models across four diverse materials science datasets. We explored two prompting strategies: one using concise numerical inputs suited for datasets with more compositional and structured features, and another using expanded descriptive text suited for datasets with more experimental and procedural features to provide additional context. Across all datasets, LLM-AL could reduce the number of experiments needed to reach top-performing candidates by over 70% and consistently outperformed traditional ML models. We found that LLM-AL performs broader and more exploratory searches while still reaching the optima with fewer iterations. We further examined the stability boundaries of LLM-AL given the inherent non-determinism of LLMs and found its performance to be broadly consistent across runs, within the variability range typically observed for traditional ML approaches. These results demonstrate that LLM-AL can serve as a generalizable alternative to conventional AL pipelines for more efficient and interpretable experiment selection and potential LLM-driven autonomous discovery.
Problem

Research questions and friction points this paper is trying to address.

Addressing cold-start limitations in traditional active learning models
Reducing experiment iterations through training-free LLM frameworks
Overcoming domain-specific feature engineering with universal text representations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Training-free framework using large language models
Iterative few-shot setting with two prompting strategies
Reduces experiments needed by over 70%
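The iterative few-shot loop behind these contributions can be sketched as below. This is a minimal illustration under stated assumptions, not the authors' implementation: `llm_propose` is a hypothetical stand-in for the actual few-shot LLM prompt, replaced here by a simple exploit-plus-explore rule so the example runs without an API call. The key property it mirrors is that no surrogate model is ever trained; each round, the full history of observed (input, property) pairs is handed to the proposer, which names the next experiment.

```python
import random


def llm_propose(observed, candidates):
    """Hypothetical stand-in for the LLM call. In the paper, an LLM is
    prompted with the observed (input, property) pairs as few-shot
    examples plus the remaining candidate pool; here we approximate
    that with a greedy pick near the best observed point, with
    occasional random exploration."""
    if not observed or random.random() < 0.3:  # exploratory move
        return random.choice(candidates)
    best_x, _ = max(observed, key=lambda xy: xy[1])
    return min(candidates, key=lambda x: abs(x - best_x))


def llm_al(pool, measure, budget, seed=0):
    """Training-free active-learning loop: no model fitting between
    rounds, just repeated proposal and measurement."""
    random.seed(seed)
    candidates = list(pool)
    observed = []  # history of (input, measured property) pairs
    for _ in range(budget):
        x = llm_propose(observed, candidates)
        candidates.remove(x)           # never re-run an experiment
        observed.append((x, measure(x)))
    return observed


if __name__ == "__main__":
    pool = [i / 10 for i in range(11)]        # toy composition grid
    target = lambda x: -(x - 0.7) ** 2        # toy property, optimum at 0.7
    history = llm_al(pool, target, budget=5)
    print(max(history, key=lambda xy: xy[1]))
```

In the paper's setting, `measure` would be a real experiment or simulation, and the proposer would switch between the two prompting strategies (concise numerical vs. extended textual) depending on whether the dataset is compositional or procedural.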
Hongchen Wang
Department of Materials Science and Engineering, University of Toronto, ON, Canada
Rafael Espinosa Castañeda
Department of Materials Science and Engineering, University of Toronto, ON, Canada
Jay R. Werber
Department of Chemical Engineering, University of Toronto, ON, Canada
Yao Fehlis
Artificial, Inc.
Computational Chemistry, Plasmonics, Machine Learning
Edward Kim
Cohere, Toronto, ON, Canada
Jason Hattrick-Simpers
Department of Materials Science and Engineering, University of Toronto
artificial intelligence, autonomous science, combinatorial materials science, compositionally complex alloys, metallic glasses