Publication
Applied Mathematics Letters
Paper
Accelerating neural net dynamics by boundary layer methods
Abstract
A boundary layer method is presented for accelerating the solution of the differential equations that represent the dynamics of an analog relaxation neural net in the high-gain limit. The inverse of the gain parameter in an analog neuron's transfer function serves as a small parameter, in terms of which the net dynamics separate into two time scales. This separation yields economies in the numerical treatment of the associated differential equations, which constitute the acceleration in question. Illustrative computations are presented. © 1993.
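The abstract names the ingredients of the method (a small parameter equal to the inverse gain, and a resulting fast/slow splitting of the net dynamics) but not the governing equations themselves. The sketch below is therefore only a minimal Python illustration of that generic two-time-scale idea on an assumed toy fast/slow pair, not a reproduction of the paper's formulation: the variables v and w, the parameters eps, a, and I_ext, and the reduced and composite solutions are all assumptions introduced here for illustration.

```python
# Minimal sketch of the two-time-scale (boundary layer) idea, NOT the
# paper's actual equations.  A toy fast/slow pair is assumed, in which
# eps plays the role of the inverse gain: the fast "activation" v relaxes
# to tanh(w) on an O(eps) time scale, while w evolves on an O(1) scale.
import numpy as np
from scipy.integrate import solve_ivp

eps = 1e-3           # small parameter ~ 1/gain (assumed)
a, I_ext = 0.5, 0.2  # illustrative coupling and input

def full_rhs(t, y):
    """Stiff full system: eps*v' = -v + tanh(w),  w' = -w + a*v + I_ext."""
    v, w = y
    return [(-v + np.tanh(w)) / eps, -w + a * v + I_ext]

def outer_rhs(t, w):
    """Outer (reduced) system: v is slaved to w via v = tanh(w)."""
    return -w + a * np.tanh(w) + I_ext

v0, w0 = -0.8, 0.4
t_end = 5.0
t_eval = np.linspace(0.0, t_end, 200)

# Reference: integrate the stiff full system (increasingly costly as eps -> 0).
full = solve_ivp(full_rhs, (0.0, t_end), [v0, w0],
                 method="Radau", t_eval=t_eval, rtol=1e-8, atol=1e-10)

# Boundary layer approximation:
#   outer solution  : integrate only the slow, non-stiff equation for w;
#   inner correction: v decays from v0 to tanh(w0) in stretched time t/eps.
outer = solve_ivp(outer_rhs, (0.0, t_end), [w0], t_eval=t_eval, rtol=1e-8)
w_approx = outer.y[0]
v_approx = np.tanh(w_approx) + (v0 - np.tanh(w0)) * np.exp(-t_eval / eps)

err_v = np.max(np.abs(full.y[0] - v_approx))
err_w = np.max(np.abs(full.y[1] - w_approx))
print(f"full (stiff) solver RHS evaluations : {full.nfev}")
print(f"outer (reduced) solver RHS evaluations: {outer.nfev}")
print(f"max |v error| = {err_v:.2e}, max |w error| = {err_w:.2e}")
```

In this toy setting the reduced (outer) integration is non-stiff and requires far fewer right-hand-side evaluations than the stiff full system, while the leading-order composite solution differs from it by O(eps); this is roughly the kind of numerical economy the abstract alludes to.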