Publication
ISIT 1993
Conference paper
Huffman algebras for independent random variables
Abstract
The Huffman algorithm was originally devised to construct the instantaneous source code of minimal expected length for a random variable. In this paper, we identify the Hardy-Littlewood-Pólya inequality as the key step in the proof of the optimality of the Huffman algorithm, and we use it to provide a unified framework for various applications of the Huffman algorithm. Quantities analogous to entropy are identified for these applications. We also consider the case in which the weights of the leaves of the tree are independent random variables. A Huffman algebra is defined, which provides conditions under which the Huffman algorithm is optimal in the case of random weights. In particular, it is shown that the 'most balanced' tree is optimal for independent and identically distributed weights in any arrangement-increasing Huffman algebra.
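As a point of reference for the classical setting the abstract starts from, the following sketch implements the standard Huffman construction (repeatedly merging the two smallest weights) and checks its expected code length against the entropy. The symbol probabilities are illustrative and not taken from the paper.

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return the codeword length Huffman's algorithm assigns to each symbol."""
    # Heap entries: (subtree weight, unique tiebreak id, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        w1, _, s1 = heapq.heappop(heap)
        w2, _, s2 = heapq.heappop(heap)
        # Merging two subtrees pushes every leaf under them one level deeper.
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (w1 + w2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]          # illustrative distribution
lengths = huffman_code_lengths(probs)
expected_length = sum(p * l for p, l in zip(probs, lengths))
entropy = -sum(p * math.log2(p) for p in probs)
# Optimality: the expected length is minimal over all prefix codes,
# and lies within one bit of the entropy.
```

The merge step (pop the two smallest weights, push their sum) is exactly the operation the paper generalizes by replacing ordinary addition with an abstract algebra operation on random weights.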