Mathematical Systems Theory
Information theory and the complexity of Boolean functions
Abstract
This paper explores the connections between two areas pioneered by Shannon: the transmission of information with a fidelity criterion, and the realization of Boolean functions by networks and formulae. We study three phenomena:
1. The effect of the relative number of 0's and 1's in a function's table on its complexity.
2. The effect of the number of unspecified entries in a partially specified function's table on its complexity.
3. The effect of the number of errors allowed in the realization of a function on its complexity.
Our main result is a precise version of the following statement: the complexity of approximately realizing a partially specified Boolean function, in whose table a fraction d of the entries are unspecified and a fraction p of the specified entries are 1's, with errors allowed in a fraction not more than e of the specified entries, is less by the factor (1 - d)[H(p) - H(e)] (where H(z) = -z log2 z - (1 - z) log2(1 - z) is the binary entropy function) than the complexity of exactly realizing an arbitrary fully specified Boolean function. We also give an intuitively appealing information-theoretic interpretation of the result. © 1977 Springer-Verlag New York Inc.
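As a quick numerical illustration (not part of the original abstract), the sketch below evaluates the binary entropy function H and the reduction factor (1 - d)[H(p) - H(e)] stated in the main result; the function names and the sample values of d, p, and e are illustrative assumptions only.

```python
import math

def binary_entropy(z: float) -> float:
    """H(z) = -z log2 z - (1 - z) log2 (1 - z), with H(0) = H(1) = 0."""
    if z <= 0.0 or z >= 1.0:
        return 0.0
    return -z * math.log2(z) - (1.0 - z) * math.log2(1.0 - z)

def reduction_factor(d: float, p: float, e: float) -> float:
    """Factor (1 - d) * [H(p) - H(e)]: how much smaller the complexity of
    approximately realizing the partially specified function is, relative to
    exactly realizing an arbitrary fully specified Boolean function."""
    return (1.0 - d) * (binary_entropy(p) - binary_entropy(e))

# Example: a fully specified function (d = 0) with balanced 0's and 1's
# (p = 1/2) and errors allowed in at most 1% of the entries (e = 0.01).
print(reduction_factor(d=0.0, p=0.5, e=0.01))  # ~0.919
```

In this sample setting the factor is about 0.92, i.e. allowing a 1% error rate on a balanced, fully specified function already trims roughly 8% off the complexity of exact realization, consistent with the form of the stated result.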