Ethereum Price Prediction Employing Large Language Models for Short-term and Few-shot Forecasting

📅 2025-03-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the significant challenge of short-term Ethereum price forecasting under few-shot learning conditions. We propose a temporal transfer learning framework leveraging pretrained large language models (LLMs), marking the first cross-modal adaptation of LLMs—originally pretrained on natural language and image data—to cryptocurrency price prediction. Our method introduces temporal data adaptation and a hierarchical selective freezing strategy to align LLMs with financial time series, thereby enhancing generalization and few-shot adaptability. Extensive experiments demonstrate that our approach consistently outperforms classical statistical models (e.g., ARIMA), deep temporal models (e.g., LSTM, N-BEATS), and state-of-the-art methods across multiple metrics—including MSE, MAE, and RMSE—validating the efficacy and robustness of LLMs in financial time-series forecasting. This work establishes a novel paradigm for few-shot financial prediction.

📝 Abstract
Cryptocurrencies have transformed financial markets with their innovative blockchain technology and volatile price movements, presenting both challenges and opportunities for predictive analytics. Ethereum, being one of the leading cryptocurrencies, has experienced significant market fluctuations, making its price prediction an attractive yet complex problem. This paper presents a comprehensive study on the effectiveness of Large Language Models (LLMs) in predicting Ethereum prices for short-term and few-shot forecasting scenarios. The main challenge in training models for time series analysis is the lack of data. We address this with a novel approach that adapts existing LLMs, pre-trained on billions of tokens of natural language or image data, to the unique characteristics of Ethereum price time series data. Through thorough experimentation and comparison with traditional and contemporary models, our results demonstrate that selectively freezing certain layers of pre-trained LLMs achieves state-of-the-art performance in this domain. This approach consistently surpasses benchmarks across multiple metrics, including Mean Squared Error (MSE), Mean Absolute Error (MAE), and Root Mean Squared Error (RMSE), demonstrating its effectiveness and robustness. Our research not only contributes to the existing body of knowledge on LLMs but also provides practical insights into the cryptocurrency prediction domain. The adaptability of pre-trained LLMs to handle the volatile nature of Ethereum prices suggests a promising direction for future research, potentially including the integration of sentiment analysis to further refine forecasting accuracy.
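The evaluation metrics named in the abstract (MSE, MAE, RMSE) are standard point-forecast error measures. As a quick reference, a minimal stdlib-only Python sketch of how they are computed (the price values below are illustrative only, not figures from the paper):

```python
import math

def mse(y_true, y_pred):
    """Mean Squared Error: average of squared residuals."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def mae(y_true, y_pred):
    """Mean Absolute Error: average of absolute residuals."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root Mean Squared Error: square root of the MSE."""
    return math.sqrt(mse(y_true, y_pred))

# Hypothetical ETH closing prices (USD) vs. model forecasts:
actual = [1800.0, 1825.0, 1810.0, 1840.0]
predicted = [1795.0, 1830.0, 1815.0, 1835.0]
print(mse(actual, predicted), mae(actual, predicted), rmse(actual, predicted))
# → 25.0 5.0 5.0
```

RMSE shares the units of the target series (here, USD), which is why papers often report it alongside MSE.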
Problem

Research questions and friction points this paper is trying to address.

Predict Ethereum prices using LLMs for short-term forecasting
Address data scarcity in time series with pre-trained LLMs
Improve accuracy over traditional models with selective layer freezing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Adapts pre-trained LLMs for Ethereum price prediction
Freezes selective LLM layers for optimal performance
Outperforms benchmarks in multiple error metrics
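The paper does not include published code, but the "hierarchical selective freezing" idea can be sketched in plain Python. The sketch below is a hypothetical illustration, assuming a GPT-style backbone of stacked transformer blocks; the choice to keep layer norms and the top blocks trainable follows common practice in cross-modal LLM adaptation and is an assumption, not the authors' exact recipe:

```python
# Hypothetical sketch of selective layer freezing (not the authors' code).
# The pretrained backbone is modeled as an ordered list of named parameter
# groups; we return a {name: trainable} map that a training loop could use
# to set requires_grad flags.

def selective_freeze(layer_names, n_top_trainable=2):
    """Freeze everything except the top transformer blocks, the layer
    norms, and the task-specific head."""
    blocks = [n for n in layer_names if n.startswith("block_")]
    top_blocks = set(blocks[-n_top_trainable:])
    flags = {}
    for name in layer_names:
        flags[name] = (
            name in top_blocks      # top blocks adapt to the price series
            or "norm" in name       # norms are cheap to tune (assumption)
            or name == "head"       # new regression head for price output
        )
    return flags

layers = ["embedding", "block_0", "block_1", "block_2", "block_3",
          "final_norm", "head"]
flags = selective_freeze(layers)
# → embedding and lower blocks frozen; block_2, block_3, final_norm,
#   head remain trainable
```

Freezing the lower layers preserves the generic representations learned during pre-training, which is what makes the approach viable in the few-shot regime the paper targets.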
Eftychia Makri
Department of Electrical Engineering, Yale University, New Haven, CT, USA
Georgios Palaiokrassas
Department of Electrical Engineering, Yale University, New Haven, CT, USA
Sarah Bouraga
Metis Lab, EM Normandie Business School, Paris, France
Antigoni Polychroniadou
Executive Director, JPMorgan AI Research - Head of JPMorgan AlgoCRYPT CoE
Cryptography
Leandros Tassiulas
Professor of Electrical Engineering, Yale University
Networking, wireless communications, sensor networks, internet protocols and architectures, stochastic optimization