Publication
Journal of Parallel and Distributed Computing
Paper

Channel allocation under batching and VCR control in video-on-demand systems


Abstract

In order to guarantee continuous delivery of a video stream in an on-demand video server environment, a collection of resources (referred to as a logical channel) is reserved in advance. To conserve server resources, multiple client requests for the same video can be batched together and served by a single channel. Increasing the window over which all requests for a particular video are batched results in larger savings in server capacity; however, it also increases the reneging probability of a client. A complication introduced by batching is that if a batched client pauses, a new stream (which may not be immediately available) needs to be started when the client resumes. To provide short response time to resume requests, some channels are set aside and are referred to as contingency channels. To further improve resource utilization, even when a nonbatched client pauses, the channel is released and reacquired upon resume. In this paper, we first develop an analytical model that predicts the reneging probability and expected resume delay, and then use this model to optimally allocate channels for batching, on-demand playback, and contingency. The effectiveness of the proposed policy over a scheme with no contingency channels and no batching is also demonstrated. © 1995 Academic Press, Inc.
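To make the channel-partitioning idea concrete, the sketch below brute-forces a split of a fixed channel pool between batching channels and contingency channels, using the classical Erlang-B blocking formula as a crude stand-in for the paper's analytical predictions of reneging probability and resume delay. It is an illustrative toy only, not the model developed in the paper; the function names, load values, and cost weighting are all hypothetical.

```python
def erlang_b(channels: int, offered_load: float) -> float:
    """Blocking probability of an M/M/c/c loss system (iterative Erlang-B)."""
    b = 1.0
    for k in range(1, channels + 1):
        b = offered_load * b / (k + offered_load * b)
    return b


def search_partition(total_channels: int,
                     batch_load: float,
                     resume_load: float,
                     resume_weight: float = 1.0):
    """Try every split of the pool between batching and contingency channels
    and return the split with the lowest weighted blocking (toy objective)."""
    best = None
    for contingency in range(1, total_channels):
        batching = total_channels - contingency
        renege = erlang_b(batching, batch_load)            # proxy for reneging probability
        resume_block = erlang_b(contingency, resume_load)  # proxy for resume delay
        cost = renege + resume_weight * resume_block
        if best is None or cost < best[0]:
            best = (cost, batching, contingency, renege, resume_block)
    return best


if __name__ == "__main__":
    # Hypothetical load values, chosen only so the example runs end to end.
    cost, batching, contingency, renege, resume_block = search_partition(
        total_channels=100, batch_load=70.0, resume_load=12.0)
    print(f"batching channels: {batching}, contingency channels: {contingency}")
    print(f"proxy reneging prob.: {renege:.4f}, proxy resume blocking: {resume_block:.4f}")
```

In this toy formulation the trade-off mirrors the one described in the abstract: giving more channels to batching lowers the reneging proxy but leaves fewer contingency channels for resume requests, and the search simply picks the partition with the lowest combined cost.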
