Beyond Efficiency: A Systematic Survey of Resource-Efficient Large Language Models

📅 2024-01-01
🏛️ arXiv.org
📈 Citations: 35
Influential: 0
📄 PDF
🤖 AI Summary
Large language models (LLMs) incur substantial computational, memory, energy, network, and financial costs, which limits their deployment and sustainable adoption in resource-constrained environments. This survey systematically reviews techniques for improving LLM resource efficiency across the full lifecycle: architecture design, pretraining, fine-tuning, and inference/system design. It introduces a "resource-technique" mapping that links each resource type to the optimization techniques that target it, covering model compression, sparsification, quantization, knowledge distillation, efficient attention mechanisms, and system-level optimizations, and it presents standardized evaluation metrics and datasets for consistent comparison. The result is a comprehensive technology landscape spanning five resource dimensions and four development stages, along with open research directions for efficient, sustainable LLM development and deployment.
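As a concrete illustration of one technique the summary names, quantization, the following is a minimal sketch of symmetric 8-bit post-training weight quantization in NumPy. The function names and the toy weight matrix are illustrative assumptions, not code from the paper.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor quantization: map float32 weights to int8."""
    scale = np.abs(w).max() / 127.0  # single scale for the whole tensor
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Approximate reconstruction of the original float weights."""
    return q.astype(np.float32) * scale

# Toy weight matrix standing in for one LLM layer (illustrative only).
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print("max abs reconstruction error:", float(np.abs(w - w_hat).max()))
```

Storing weights as int8 instead of float32 cuts weight memory roughly 4x at a small accuracy cost, which is the kind of memory-resource trade-off the survey catalogs.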

📝 Abstract
The burgeoning field of Large Language Models (LLMs), exemplified by sophisticated models like OpenAI's ChatGPT, represents a significant advancement in artificial intelligence. These models, however, bring substantial challenges in their high consumption of computational, memory, energy, and financial resources, especially in environments with limited resource capabilities. This survey aims to systematically address these challenges by reviewing a broad spectrum of techniques designed to enhance the resource efficiency of LLMs. We categorize methods based on their optimization focus (computational, memory, energy, financial, and network resources) and their applicability across various stages of an LLM's lifecycle, including architecture design, pretraining, fine-tuning, and system design. Additionally, the survey introduces a nuanced categorization of resource-efficiency techniques by their specific resource types, which uncovers the intricate relationships and mappings between various resources and corresponding optimization techniques. A standardized set of evaluation metrics and datasets is also presented to facilitate consistent and fair comparisons across different models and techniques. By offering a comprehensive overview of the current state of the art and identifying open research avenues, this survey serves as a foundational reference for researchers and practitioners, aiding them in developing more sustainable and efficient LLMs in a rapidly evolving landscape.
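To make the abstract's "standardized set of evaluation metrics" concrete, here is a minimal sketch of collecting two common efficiency metrics, wall-clock latency and peak memory, around a single model call. The run_model stub and the metric names are hypothetical placeholders for illustration, not the survey's actual benchmark harness.

```python
import time
import tracemalloc

def run_model(prompt: str) -> str:
    """Hypothetical stand-in for an LLM inference call; swap in a real model."""
    return prompt[::-1]  # placeholder work

def measure(prompt: str) -> dict:
    """Record wall-clock latency and peak Python-heap memory for one call.

    Note: tracemalloc only tracks Python allocations; a real LLM benchmark
    would also record GPU memory, throughput, and energy draw.
    """
    tracemalloc.start()
    t0 = time.perf_counter()
    output = run_model(prompt)
    latency_s = time.perf_counter() - t0
    _, peak_bytes = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return {"latency_s": latency_s, "peak_mem_bytes": peak_bytes, "output": output}

print(measure("resource-efficient LLMs"))
```

Reporting the same metrics for every model and technique is what makes the cross-method comparisons in the survey fair.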
Problem

Research questions and friction points this paper is trying to address.

Resource Constraints
Large Language Models
Sustainability

Innovation

Methods, ideas, or system contributions that make the work stand out.

Resource-efficient Models
Standardized Evaluation
Sustainable AI
Guangji Bai
Applied Scientist, Amazon
Machine Learning · LLM Efficiency · Model Pruning
Zheng Chai
School of Data Science and Department of Computer Science, University of Virginia, 1827 University Avenue, Charlottesville, 22904, VA, United States
Chen Ling
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States
Shiyu Wang
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States
Jiaying Lu
Research Assistant Professor, Center for Data Science, School of Nursing, Emory University
AI for Healthcare · Knowledge Graph · Multimodal Learning · Large Language Model
Nan Zhang
College of Information Sciences and Technology, Pennsylvania State University, 201 Old Main, University Park, 16802, PA, United States
Tingwei Shi
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States
Ziyang Yu
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States
Mengdan Zhu
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States
Yifei Zhang
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States
Carl Yang
Waymo LLC; PhD from the University of California, Davis
GPU Computing · Parallel Computing · Graph Processing
Yue Cheng
School of Data Science and Department of Computer Science, University of Virginia, 1827 University Avenue, Charlottesville, 22904, VA, United States
Liang Zhao
Department of Computer Science, Emory University, 201 Dowman Dr, Atlanta, 30322, GA, United States