Publication
MWSCAS 1990
Conference paper
Analysis and synthesis of neural networks using linear separation
Abstract
General analysis and synthesis methods for neural networks are presented. The proposed techniques are simple, efficient, and not restricted to a particular network architecture; they apply to multilayer, fully interconnected, feedforward, or feedback structures. Based on whether the signs of the connections between neurons (called weight signatures) are excitatory or inhibitory, the proposed methods provide fundamental rules of learnability in such networks. Various design techniques using these learning rules are presented for the synthesis of neural architectures.
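The paper itself is not reproduced here, but the two core notions the abstract names, linear separation and weight signatures, can be illustrated with a minimal sketch. The example below is an assumption-laden illustration, not the paper's method: it trains a single perceptron on a linearly separable function (logical AND) and then reads off the signs of the learned weights, which is one simple way to interpret a "weight signature" as an excitatory (+) or inhibitory (-) pattern.

```python
def perceptron_train(samples, epochs=100, lr=1.0):
    """Classic perceptron learning rule; converges iff the data
    are linearly separable. samples: list of (inputs, target),
    targets in {0, 1}."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        updated = False
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            if y != t:
                # Move the separating hyperplane toward the misclassified point
                for i in range(n):
                    w[i] += lr * (t - y) * x[i]
                b += lr * (t - y)
                updated = True
        if not updated:
            break  # no errors in a full pass: data linearly separated
    return w, b

# AND is linearly separable, so training converges
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = perceptron_train(and_data)

# "Weight signature": the sign pattern of the learned connections
# (hypothetical reading of the term; both inputs of AND come out excitatory)
signature = ["+" if wi > 0 else "-" for wi in w]
```

A non-separable function such as XOR would never converge under this rule, which is why multilayer structures (as considered in the paper) are needed for such cases.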