Publication
SEC 2019
Conference paper

DeFog: Fog computing benchmarks

Abstract

There are currently no benchmarks that can directly compare the performance of an application across cloud-only, edge-only, and cloud-edge (Fog) deployment platforms to obtain insight into potential performance improvements. This paper proposes DeFog, a first Fog benchmarking suite, to: (i) alleviate the burden of Fog benchmarking by using a standard methodology, and (ii) facilitate the understanding of the target platform by collecting a catalogue of relevant metrics for a set of benchmarks. The current portfolio of DeFog benchmarks comprises six relevant applications conducive to using the edge. Experimental studies are carried out on multiple target platforms to demonstrate the use of DeFog for collecting metrics related to application latencies (communication and computation), for understanding the impact of stress and concurrent users on application latencies, and for understanding the performance of deploying different combinations of services of an application across the cloud and edge. DeFog is available for public download (https://github.com/qub-blesson/DeFog).
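To illustrate the kind of measurement the abstract describes, the sketch below separates communication latency from computation latency for a single request, so that the same workload can be compared across a cloud and an edge endpoint. This is not DeFog's actual code; the endpoint URLs and the assumption that the service reports its own compute time in a response header are hypothetical.

```python
# Illustrative sketch (not DeFog's implementation): split round-trip time
# into communication and computation latency for one request.
import time
import requests

def measure_latencies(endpoint: str, payload: bytes) -> dict:
    start = time.monotonic()
    response = requests.post(endpoint, data=payload, timeout=30)
    round_trip = time.monotonic() - start
    # Hypothetical convention: the service reports its own processing time
    # in a header, so communication latency can be derived by subtraction.
    computation = float(response.headers.get("X-Compute-Seconds", 0.0))
    return {
        "round_trip_s": round_trip,
        "computation_s": computation,
        "communication_s": max(round_trip - computation, 0.0),
    }

# Example: issue the same request to hypothetical cloud and edge deployments.
for name, url in [("cloud", "https://cloud.example.com/infer"),
                  ("edge", "http://192.168.1.10:8080/infer")]:
    print(name, measure_latencies(url, b"sample input"))
```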

Date

07 Nov 2019