Publication
COMPCON 1995
Conference paper

Buffering and caching in large-scale video servers

Abstract

Video-on-demand servers are characterized by stringent real-time constraints, as each stream requires isochronous data playout. The capacity of the system depends on the acceptable jitter per stream (the number of data blocks that do not meet their real-time constraints). Per-stream read-ahead buffering avoids the disruption in playback caused by variations in disk access time and queuing delays. With heavily skewed access patterns to the stored video data, the system is often disk-arm-bound. In such cases, serving video streams from a memory cache can result in a substantial reduction in server cost. In this paper, we study the cost-performance trade-offs of various buffering and caching strategies that can be used in a large-scale video server. We first study the cost impact of varying the buffer size, disk utilization, and disk characteristics on the overall capacity of the system. Subsequently, we study the cost-effectiveness of a technique for memory caching across streams that exploits temporal locality and workload fluctuations.
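
The abstract describes the trade-off only qualitatively. As a rough intuition for the read-ahead point, the sketch below is a toy Python model (the block count, playout interval, and disk-service distribution are invented for illustration and are not taken from the paper): playback of a stream starts once a read-ahead buffer of a given size is filled, each block read takes a variable amount of time, and the model counts the blocks that miss their playout deadlines.

```python
import random

def simulate_stream(readahead_blocks, n_blocks=1000, playout_interval=1.0,
                    mean_service=0.7, service_jitter=0.5, seed=1):
    """Toy model of one video stream served from a disk.

    Blocks are read sequentially; each read takes a variable time
    (seek, rotation, and queuing lumped into one random term).
    Playback starts once `readahead_blocks` blocks are buffered, and
    block i must arrive by start + i * playout_interval.
    Returns the fraction of late blocks (the per-stream jitter in the
    abstract's sense).
    """
    rng = random.Random(seed)
    finish = 0.0                 # time the disk finishes the current read
    arrival = []
    for _ in range(n_blocks):
        finish += mean_service + rng.uniform(0.0, service_jitter)
        arrival.append(finish)
    start = arrival[min(readahead_blocks, n_blocks) - 1]   # buffer filled
    late = sum(1 for i, t in enumerate(arrival)
               if t > start + i * playout_interval)
    return late / n_blocks

if __name__ == "__main__":
    for b in (1, 2, 4, 8):
        print(f"read-ahead = {b} blocks -> late fraction: {simulate_stream(b):.3f}")
```

A larger read-ahead buffer gives playback more initial slack, so the same variability in disk service time produces fewer deadline misses. The paper's cost analysis weighs this per-stream buffer memory against disk bandwidth and shared cache memory, which this toy model does not attempt to capture.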
