Publication
IISWC 2006
Conference paper

The dynamics of backfilling: Solving the mystery of why increased inaccuracy may help

Abstract

Parallel job scheduling with backfilling requires users to provide runtime estimates, which the scheduler uses to pack jobs more tightly. Studies of the impact of such estimates on performance have modeled them using a "badness factor" f ≥ 0 in an attempt to capture their inaccuracy: given a runtime r, the estimate is uniformly distributed in [r, (f + 1) · r]. Surprisingly, inaccurate estimates (f > 0) yielded better performance than accurate ones (f = 0). We explain this by a "heel-and-toe" dynamic that, for f > 0, causes backfilling to approximate shortest-job-first scheduling. We further find that the effect of systematically increasing f is V-shaped: average wait time and slowdown initially drop, only to rise later on. This happens because higher values of f create bigger "holes" in the schedule (longer jobs can backfill) and increase the randomness (more long jobs appear to be short), thus overshadowing the initial heel-and-toe preference for shorter jobs. The bottom line is that artificial inaccuracy, generated by multiplying (real or perfect) estimates by a factor, is (1) just a scheduling technique that trades off fairness for performance, and (2) ill-suited for studying the effect of real inaccuracy. Real estimates are modal (90% of the jobs use the same 20 estimates) and bounded by a maximum (usually the most popular estimate). Therefore, when performing an evaluation, "increased inaccuracy" should translate to increased modality. Unlike multiplying, this indeed worsens performance, as one would intuitively expect. ©2006 IEEE.
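
To make the two contrasted estimate models concrete, the following is a minimal Python sketch of how synthetic estimates could be generated under each. The function names and the specific mode values are illustrative assumptions, not code or data from the paper.

```python
import random

def f_model_estimate(runtime, f):
    """Badness-factor model: given runtime r, draw the estimate
    uniformly from [r, (f + 1) * r]. f = 0 yields perfectly accurate
    estimates; larger f means more inaccuracy."""
    return random.uniform(runtime, (f + 1) * runtime)

def modal_estimate(runtime, modes):
    """Modal model: users pick from a small set of popular round
    values (the abstract reports that ~20 estimates cover 90% of
    jobs). Here the estimate is the smallest popular value that
    still covers the runtime, capped at the largest mode; `modes`
    is a hypothetical stand-in for the observed popular values."""
    covering = [m for m in modes if m >= runtime]
    return min(covering) if covering else max(modes)

# Example: a 45-minute job (times in seconds).
r = 45 * 60
print(f_model_estimate(r, f=3))                      # anywhere in [2700, 10800]
print(modal_estimate(r, [900, 3600, 14400, 64800]))  # -> 3600
```

Under the f-model, raising f both enlarges the holes that longer jobs can backfill into and randomizes which jobs look short; under the modal model, "increased inaccuracy" means fewer distinct modes, so more jobs share the same estimate and the scheduler loses the ability to tell them apart.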
