SeBS-Flow: Benchmarking Serverless Cloud Function Workflows

📅 2024-10-04
🏛️ arXiv.org
📈 Citations: 0 (Influential: 0)
🤖 AI Summary
Existing serverless workflow platforms exhibit significant disparities in programming models and infrastructure, undermining fairness and consistency in cross-platform performance evaluation. Method: We introduce the first platform-agnostic serverless workflow benchmark suite, featuring a unified workflow modeling methodology that encompasses six real-world applications and four microbenchmarks representing distinct computational patterns. The suite integrates standardized workflow descriptions, multi-cloud deployment adapters, end-to-end monitoring, and cost modeling. Contribution/Results: We systematically evaluate performance, cost, scalability, and runtime deviation across AWS, Azure, and GCP. Our open-source implementation enables reproducible, comparable benchmarking, establishing a community standard for fair, cross-platform evaluation—thereby addressing a critical gap in serverless workflow research and practice.

📝 Abstract
Serverless computing has emerged as a prominent paradigm with significant adoption among cloud customers. While this model offers advantages such as abstraction from deployment and resource scheduling, it also poses limitations in handling complex use cases due to the restricted nature of individual functions. Serverless workflows address this limitation by orchestrating multiple functions into a cohesive application. However, existing serverless workflow platforms exhibit significant differences in their programming models and infrastructure, making fair and consistent performance evaluations difficult in practice. To address this gap, we propose SeBS-Flow, the first serverless workflow benchmarking suite, providing a platform-agnostic workflow model that enables consistent benchmarking across platforms. SeBS-Flow includes six real-world application benchmarks and four microbenchmarks representing different computational patterns. We conduct comprehensive evaluations on three major cloud platforms, assessing performance, cost, scalability, and runtime deviations. We make our benchmark suite open source, enabling rigorous and comparable evaluations of serverless workflows over time.
Problem

Research questions and friction points this paper is trying to address.

Evaluating performance differences in serverless workflow platforms
Lack of consistent benchmarking for serverless function workflows
Addressing complex use case limitations in serverless computing
Innovation

Methods, ideas, or system contributions that make the work stand out.

Platform-agnostic serverless workflow benchmarking suite
Includes real-world and microbenchmark applications
Evaluates performance, cost, scalability, and deviations
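The core idea behind the suite is a single abstract workflow description that is lowered to each cloud's native orchestrator (e.g., AWS Step Functions). The sketch below is purely illustrative, assuming a hypothetical `Workflow`/`Task` model and a simplified sequential lowering; it is not SeBS-Flow's actual API or workflow format.

```python
# Hypothetical sketch of a platform-agnostic workflow model lowered to an
# AWS Step Functions-style state machine. All names here are illustrative
# assumptions, not SeBS-Flow's real interfaces.
from dataclasses import dataclass, field


@dataclass
class Task:
    name: str
    handler: str  # cloud function entry point, e.g. "module.handler"


@dataclass
class Workflow:
    name: str
    tasks: list = field(default_factory=list)

    def to_step_functions(self) -> dict:
        """Lower the abstract model to a sequential state-machine dict
        (simplified: tasks run in list order, no branching or parallelism)."""
        order = [t.name for t in self.tasks]
        states = {}
        for i, task in enumerate(self.tasks):
            state = {"Type": "Task", "Resource": task.handler}
            if i + 1 < len(order):
                state["Next"] = order[i + 1]  # chain to the following task
            else:
                state["End"] = True  # last task terminates the workflow
            states[task.name] = state
        return {"StartAt": order[0], "States": states}


# Example: a two-step image-processing workflow.
wf = Workflow("thumbnail", tasks=[Task("resize", "img.resize"),
                                  Task("upload", "img.upload")])
sm = wf.to_step_functions()
```

A real adapter would additionally handle fan-out/fan-in patterns, retries, and payload passing, which is where the cross-platform differences the paper benchmarks come from.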
Larissa Schmid
KASTEL, Karlsruhe Institute of Technology, Germany
Marcin Copik
ETH Zürich
High-Performance Computing · Serverless Computing · Performance Modeling
A. Calotoiu
Department of Computer Science, ETH Zürich, Switzerland
Laurin Brandner
Department of Computer Science, ETH Zürich, Switzerland
A. Koziolek
KASTEL, Karlsruhe Institute of Technology, Germany
Torsten Hoefler
Professor of Computer Science at ETH Zurich
High Performance Computing · Deep Learning · Networking · Message Passing Interface · Parallel and Distributed Computing