Towards an Efficient Performance Testing Through Dynamic Workload Adaptation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Performance testing is a critical task to ensure an acceptable user experience with software systems, especially when there are high numbers of concurrent users. Selecting an appropriate test workload is a challenging and time-consuming process that relies heavily on the testers’ expertise. Not only are workloads application-dependent, but it is also usually unclear how large a workload must be to expose any performance issues that exist in an application. Previous research has proposed to dynamically adapt test workloads in real time based on the application’s behavior. By reducing the need for the trial-and-error test cycles required when using static workloads, dynamic workload adaptation can reduce the effort and expertise needed to carry out performance testing. However, such approaches usually require testers to properly configure several parameters in order to be effective in identifying workload-dependent performance bugs, which may hinder their usability among practitioners. To address this issue, this paper examines the criteria needed to conduct performance testing efficiently using dynamic workload adaptation. We present the results of comprehensively evaluating one such approach, providing insights into how to tune it properly to obtain better outcomes in different scenarios, and we study how varying its configuration affects the results obtained.
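To make the idea concrete, dynamic workload adaptation can be viewed as a feedback loop: generate a workload, measure the application's behavior, and adjust the workload until a performance issue is exposed. The sketch below is illustrative only, not the approach evaluated in the paper; the simulated system under test, the SLA threshold, and the growth factor are all assumptions introduced for the example.

```python
import random

def measure_response_time(num_users, capacity=500):
    """Simulated system under test (a stand-in for a real
    load-generator measurement): response time degrades sharply
    as the number of concurrent users approaches capacity."""
    base = 0.05  # seconds
    load = num_users / capacity
    return base * (1.0 + load ** 4) + random.uniform(0.0, 0.005)

def adapt_workload(sla_seconds=0.2, start_users=10, growth=1.5,
                   max_users=100_000):
    """Feedback loop: grow the workload until the measured response
    time violates the SLA threshold, then report that workload as
    the point exposing a workload-dependent performance issue."""
    users = start_users
    history = []
    while users <= max_users:
        rt = measure_response_time(users)
        history.append((users, rt))
        if rt > sla_seconds:
            return users, history  # smallest tested workload that violates the SLA
        users = int(users * growth)
    return None, history  # no violation found within the test budget

breaking_point, history = adapt_workload()
```

Note that the growth factor and SLA threshold play the role of the configuration parameters the paper discusses: an aggressive growth factor finds a violation in fewer iterations but overshoots the true breaking point, while a conservative one is more precise but slower.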

Original language: English
Title of host publication: Testing Software and Systems - 31st IFIP WG 6.1 International Conference, ICTSS 2019, Proceedings
Editors: Christophe Gaston, Nikolai Kosmatov, Pascale Le Gall
Publisher: Springer
Pages: 215-233
Number of pages: 19
ISBN (Print): 9783030312794
Publication status: Published - 2019
Externally published: Yes
Event: 31st IFIP International Conference on Testing Software and Systems, ICTSS 2019 - Paris, France
Duration: 15 Oct 2019 - 17 Oct 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11812 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 31st IFIP International Conference on Testing Software and Systems, ICTSS 2019
Country/Territory: France
City: Paris
Period: 15/10/19 - 17/10/19

Keywords

  • Performance bug
  • Performance testing
  • Software engineering
  • Web systems and applications
  • Workload
