Jehoshua Bruck, Robert Cypher, et al.
SIAM Journal on Computing
The basic processing unit of a neural network is a linear threshold element. It is known that neural networks can be much more powerful than traditional logic circuits, assuming that each threshold element can be built at a cost comparable to that of AND, OR, and NOT logic elements. Whereas any logic circuit of polynomial size (in n) that computes the product of two n-bit numbers requires unbounded delay (i.e., delay that grows with n), such computations can be done in a neural network with “constant” delay. We improve some known results by showing that the product of two n-bit numbers and the sorting of n n-bit numbers can be computed by a polynomial-size neural network using only 4 and 5 unit delays, respectively. Moreover, the weights of each threshold element in our neural networks require O(log n)-bit (instead of n-bit) accuracy. © 1990, IEEE
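To illustrate the model the abstract refers to (not the paper's depth-4 multiplier), the following minimal Python sketch shows a linear threshold element and a depth-2 threshold circuit for XOR, a function no single threshold gate can compute; the function names and gate choices are illustrative assumptions, and each layer of gates corresponds to one "unit delay."

```python
def threshold_gate(weights, threshold, inputs):
    """Linear threshold element: output 1 iff sum(w_i * x_i) >= threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

def xor_depth2(x, y):
    """Depth-2 threshold circuit: XOR(x, y) = AND(OR(x, y), NAND(x, y))."""
    or_gate = threshold_gate([1, 1], 1, [x, y])       # layer 1
    nand_gate = threshold_gate([-1, -1], -1, [x, y])  # layer 1
    return threshold_gate([1, 1], 2, [or_gate, nand_gate])  # layer 2 (output)

if __name__ == "__main__":
    for x in (0, 1):
        for y in (0, 1):
            print(f"XOR({x}, {y}) = {xor_depth2(x, y)}")
```

Here every weight and threshold is a small constant, echoing the abstract's point that polynomial-size threshold circuits with O(log n)-bit weights already suffice for arithmetic tasks.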
Mario Blaum, Jehoshua Bruck, et al.
ISIT 1994
Mario Blaum, Jehoshua Bruck
ISIT 1997
Mario Blaum, Jehoshua Bruck
IEEE Trans. Inf. Theory