University of Southampton Institutional Repository

Automated identification of performance changes at code level

Reichelt, David Georg, Kühne, Stefan and Hasselbring, Wilhelm (2023) Automated identification of performance changes at code level. In 2022 IEEE 22nd International Conference on Software Quality, Reliability and Security (QRS). IEEE. pp. 916-925. (doi:10.1109/QRS57517.2022.00096).

Record type: Conference or Workshop Item (Paper)

Abstract

To develop software with optimal performance, even small performance changes need to be identified. Identifying performance changes is challenging since the performance of software is influenced by non-deterministic factors. Therefore, not every performance change is measurable with reasonable effort. In this work, we discuss which performance changes are measurable at code level with reasonable measurement effort and how to identify them. We present (1) an analysis of the boundaries of measuring performance changes, (2) an approach for determining a configuration for reproducible performance change identification, and (3) an evaluation of how well our approach identifies performance changes in the application server Jetty compared with the usage of Jetty's own performance regression benchmarks. Thereby, we find (1) that small performance differences are only measurable by fine-grained measurement workloads, (2) that performance changes caused by the change of one operation can be identified using a unit-test-sized workload definition and a suitable configuration, and (3) that our approach identifies small performance regressions more efficiently than Jetty's performance regression benchmarks.
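
To illustrate what a "unit-test-sized workload definition" with an explicit measurement configuration can look like, the following Java sketch uses JMH (the Java Microbenchmark Harness). It is not the authors' tooling or the configuration determined by their approach; the benchmarked operation, class name, and parameter values are assumptions chosen only to show how warmup, measured iterations, and repeated VM starts can be made explicit when measuring a single operation.

// Illustrative sketch only (assumed workload and parameters), not the paper's tooling.
package example;

import java.util.concurrent.TimeUnit;
import org.openjdk.jmh.annotations.*;

@State(Scope.Thread)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
@Warmup(iterations = 5)        // warmup iterations before measurement starts
@Measurement(iterations = 10)  // measured iterations per VM start
@Fork(3)                       // repeated VM starts to reduce JIT/GC noise
public class HeaderParseBenchmark {

    private String rawHeader;

    @Setup(Level.Iteration)
    public void setUp() {
        rawHeader = "Content-Type: text/html; charset=utf-8";
    }

    // Unit-test-sized workload: one operation under test, so a change in this
    // single operation is directly reflected in the measured execution time.
    @Benchmark
    public int parseHeaderName() {
        // Returning the result lets JMH consume it and prevents dead-code elimination.
        return rawHeader.indexOf(':');
    }
}

How many warmup iterations, measured iterations, and VM starts are actually needed for a reproducible decision is exactly the configuration question the abstract addresses; the values above are placeholders.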

This record has no associated files available for download.

More information

e-pub ahead of print date: 20 March 2023
Venue - Dates: 22nd IEEE International Conference on Software Quality, Reliability and Security, QRS 2022, Virtual, Online, China, 2022-12-05 - 2022-12-09
Keywords: benchmarking, performance measurement, software performance engineering

Identifiers

Local EPrints ID: 488780
URI: http://eprints.soton.ac.uk/id/eprint/488780
PURE UUID: 9b37226a-3960-4aad-b0f1-c886010253d2
ORCID for Wilhelm Hasselbring: orcid.org/0000-0001-6625-4335

Catalogue record

Date deposited: 05 Apr 2024 16:38
Last modified: 10 Apr 2024 02:15

Contributors

Author: David Georg Reichelt
Author: Stefan Kühne
Author: Wilhelm Hasselbring (orcid.org/0000-0001-6625-4335)
