Theodolite: scalability benchmarking of distributed stream processing engines in microservice architectures
Henning, Sören and Hasselbring, Wilhelm (2021) Theodolite: scalability benchmarking of distributed stream processing engines in microservice architectures. Big Data Research, 25, [100209]. (doi:10.1016/j.bdr.2021.100209).
Abstract
Distributed stream processing engines are designed with a focus on scalability to process big data volumes in a continuous manner. We present the Theodolite method for benchmarking the scalability of distributed stream processing engines. The core of this method is the definition of use cases that microservices implementing stream processing have to fulfill. For each use case, our method identifies relevant workload dimensions that might affect the scalability of a use case. We propose to design one benchmark per use case and relevant workload dimension. We present a general benchmarking framework, which can be applied to execute the individual benchmarks for a given use case and workload dimension. Our framework executes an implementation of the use case's dataflow architecture for different workloads of the given dimension and various numbers of processing instances. This way, it identifies how resource demand evolves with increasing workloads. Within the scope of this paper, we present 4 identified use cases, derived from processing Industrial Internet of Things data, and 7 corresponding workload dimensions. We provide implementations of 4 benchmarks with Kafka Streams and Apache Flink as well as an implementation of our benchmarking framework to execute scalability benchmarks in cloud environments. We use both for evaluating the Theodolite method and for benchmarking Kafka Streams' and Flink's scalability for different deployment options.
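The abstract outlines the framework's experiment scheme: run the use case's dataflow implementation for a series of workloads and instance counts and observe where it still keeps up. The following is a minimal, hypothetical Java sketch of such a loop; the names (ScalabilityExperimentSketch, BenchmarkRunner, meetsSlo, demandCurve) and the toy SLO check are illustrative assumptions, not Theodolite's actual API. For each workload it searches for the smallest number of processing instances that still meets a service-level objective, yielding the resource-demand curve described in the abstract.

import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ScalabilityExperimentSketch {

    // Hypothetical hook: deploys the use-case implementation with the given
    // number of instances, generates the given workload, and reports whether
    // the service-level objective was met.
    interface BenchmarkRunner {
        boolean meetsSlo(int workload, int instances);
    }

    // For each workload of the chosen dimension, find the minimal number of
    // processing instances that satisfies the SLO.
    static Map<Integer, Integer> demandCurve(List<Integer> workloads,
                                             int maxInstances,
                                             BenchmarkRunner runner) {
        Map<Integer, Integer> demand = new LinkedHashMap<>();
        for (int workload : workloads) {
            for (int instances = 1; instances <= maxInstances; instances++) {
                if (runner.meetsSlo(workload, instances)) {
                    demand.put(workload, instances);
                    break;
                }
            }
        }
        return demand; // maps workload -> minimal instances, i.e., how resource demand evolves
    }

    public static void main(String[] args) {
        // Toy stand-in for a real deployment: pretend each instance handles 50,000 messages/s.
        BenchmarkRunner fake = (workload, instances) -> (long) instances * 50_000 >= workload;
        List<Integer> workloads = List.of(50_000, 100_000, 200_000, 400_000);
        System.out.println(demandCurve(workloads, 16, fake));
    }
}

In a real setup, meetsSlo would deploy the Kafka Streams or Flink implementation of the use case in the cloud environment, generate the specified load, and evaluate a lag- or latency-based criterion rather than the arithmetic stand-in used here.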
More information
Accepted/In Press date: 23 January 2021
e-pub ahead of print date: 4 February 2021
Published date: 9 February 2021
Keywords:
Benchmarking, Microservices, Scalability, Stream processing
Identifiers
Local EPrints ID: 488717
URI: http://eprints.soton.ac.uk/id/eprint/488717
PURE UUID: cab77891-d8c3-404d-87e8-4623ff4fc1b1
Catalogue record
Date deposited: 04 Apr 2024 16:52
Last modified: 10 Apr 2024 02:15
Contributors
Author:
Sören Henning
Author:
Wilhelm Hasselbring