Publication
CLOUD 2012
Conference paper

Scheduling parallel tasks onto opportunistically available cloud resources

Abstract

We consider the problem of opportunistically scheduling low-priority tasks onto underutilized cloud computation resources left idle by high-priority tasks. To avoid conflicts with high-priority tasks, the scheduler must either suspend the low-priority tasks (causing waiting) or move them to other underutilized servers (causing migration) when the high-priority tasks resume. The goal of opportunistic scheduling is to place the low-priority tasks onto intermittently available server resources while minimizing the combined cost of waiting and migration. Moreover, we aim to support multiple parallel low-priority tasks with synchronization constraints. Under the assumption that servers' availability to low-priority tasks can be modeled as ON/OFF Markov chains, we have shown that the optimal solution requires solving a Markov Decision Process (MDP) of exponential complexity, and efficient solutions are known only for homogeneously behaving servers. In this paper, we propose an efficient heuristic scheduling policy by formulating the problem as restless Multi-Armed Bandits (MAB) under relaxed synchronization. We prove the indexability of the problem and provide closed-form formulas to compute the indices. Our evaluation using real data center traces shows that the measured performance closely matches the prediction of the Markov chain model, and the proposed index policy achieves consistently good performance under various server dynamics compared with existing policies. © 2012 IEEE.
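The setting described above can be illustrated with a small simulation. The sketch below is not the paper's restless-bandit index policy: it only models servers whose availability to low-priority work follows independent two-state ON/OFF Markov chains and applies a simple, hypothetical stay-or-move rule that trades an expected waiting cost against a fixed migration cost, for a single task rather than the paper's synchronized parallel tasks. All names and parameters (Server, run, p_off, p_on, c_wait, c_migrate) are illustrative assumptions, not taken from the paper.

```python
import random

class Server:
    """Two-state ON/OFF Markov chain for a server's availability to low-priority work."""
    def __init__(self, p_off, p_on):
        self.p_off = p_off  # P(ON -> OFF): high-priority work reclaims the server
        self.p_on = p_on    # P(OFF -> ON): high-priority work departs
        self.on = random.random() < self.stationary_on()

    def stationary_on(self):
        # Stationary probability of the ON state for the two-state chain.
        return self.p_on / (self.p_on + self.p_off)

    def step(self):
        # Advance the availability chain by one time slot.
        if self.on:
            self.on = random.random() >= self.p_off
        else:
            self.on = random.random() < self.p_on


def run(servers, horizon=10_000, c_wait=1.0, c_migrate=5.0):
    """Single low-priority task: when its host turns OFF, migrate to the ON server
    with the largest stationary ON probability if the expected waiting cost exceeds
    the migration cost; otherwise wait in place. Returns the accumulated cost."""
    host = max(range(len(servers)), key=lambda i: servers[i].stationary_on())
    total_cost = 0.0
    for _ in range(horizon):
        for s in servers:
            s.step()
        if servers[host].on:
            continue  # host is available; the task runs at no cost this slot
        expected_wait = c_wait / servers[host].p_on  # geometric OFF period: mean 1/p_on slots
        candidates = [i for i, s in enumerate(servers) if s.on]
        if candidates and expected_wait > c_migrate:
            host = max(candidates, key=lambda i: servers[i].stationary_on())
            total_cost += c_migrate
        else:
            total_cost += c_wait
    return total_cost


if __name__ == "__main__":
    random.seed(0)
    pool = [Server(p_off=random.uniform(0.05, 0.3), p_on=random.uniform(0.1, 0.5))
            for _ in range(8)]
    print("waiting + migration cost over the horizon:", run(pool))
```

This greedy rule only looks at stationary ON probabilities; the paper's index policy instead assigns each server a closed-form index derived from the restless-MAB relaxation and schedules accordingly.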

Date

02 Oct 2012
