Publication
IEEE Trans. Inf. Theory
Fast universal coding with context models

Abstract

A universal code using context models is constructed that is optimal in the strong sense that the mean per-symbol code length asymptotically approaches the entropy of any data-generating Tree Machine at the fastest possible rate. The number of coding operations required is O(n log log n) for a string of length n, which is much smaller than in earlier codes using context models. This is made possible by two new results. The first is a theorem stating that the updates required for universality need be done only once per batch, where the batch length grows at a nearly exponential rate in the number of batches. The second is a fast implementation of the tree-building part of Algorithm Context.
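The batching idea above can be illustrated with a small sketch. The schedule below is hypothetical (the paper's exact batch lengths are not given in the abstract): batch lengths grow geometrically, so a string of n symbols is split into O(log n) batches, and the costly model update runs once per batch rather than once per symbol. The function name `batch_boundaries` and the growth factor are illustrative choices, not the paper's.

```python
import math

def batch_boundaries(n, growth=2.0, first=1):
    """Hypothetical batch schedule: return the positions in a string of
    length n at which a model update would occur. With geometrically
    growing batch lengths, only O(log n) updates are needed in total."""
    boundaries = []
    pos, length = 0, first
    while pos < n:
        pos = min(n, pos + length)   # end of the current batch
        boundaries.append(pos)
        length = int(math.ceil(length * growth))  # next batch is longer
    return boundaries

bounds = batch_boundaries(1_000_000)
# For n = 10**6 with doubling batches, only about 20 update events occur,
# versus 10**6 if the model were updated after every symbol.
```

The point of the sketch is only the counting argument: with near-exponentially growing batches the number of update events is logarithmic in n, which is how the per-symbol overhead of maintaining the context model can be driven down.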
