Publication
CLOUD 2022
Short paper

AutoDECK: Automated Declarative Performance Evaluation and Tuning Framework on Kubernetes

Abstract

Containerization and application variety bring many challenges to automating evaluations for performance tuning and for comparing infrastructure choices. Because benchmarks and evaluation tools are tightly coupled by design, existing automated tools on Kubernetes are limited to trivial microbenchmarks and cannot be extended to complex cloud-native architectures such as microservices and serverless, which are usually managed by customized operators that set up workload dependencies. In this paper, we propose AutoDECK, a fully declarative performance evaluation framework. The proposed framework automates configuring, deploying, evaluating, summarizing, and visualizing benchmarking workloads. It seamlessly integrates mature Kubernetes-native systems and adds functionality such as image-build pipeline tracking and auto-tuning. We present five use cases of evaluation and analysis across various kinds of benchmarks, including microbenchmarks and HPC/AI benchmarks. The evaluation results also differentiate characteristics such as resource usage behavior and parallelism effectiveness between different clusters. Furthermore, the results demonstrate the benefit of integrating an auto-tuning feature into the proposed framework, as shown by a 10% increase in transferred memory bytes in the Sysbench benchmark.

Date

09 Jul 2021