M. Tismenetsky
International Journal of Computer Mathematics
A boundary layer method is presented for accelerating the solution of the differential equations that govern the dynamics of an analog relaxation neural net in the high-gain limit. The inverse of the gain parameter of an analog neuron's transfer function is used as a small parameter, in terms of which the net dynamics separate into two time scales. This separation leads to economies in the numerical treatment of the associated differential equations, i.e., the acceleration in question. Illustrative computations are presented. © 1993.
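A minimal sketch of the time-scale separation, assuming a standard Hopfield-type analog relaxation model with sigmoidal transfer function g and gain 1/\varepsilon (the paper's exact equations are not reproduced here; the connection weights T_{ij} and inputs I_i below are assumed notation):

\[
  \frac{du_i}{dt} \;=\; -u_i \;+\; \sum_j T_{ij}\, g\!\left(\frac{u_j}{\varepsilon}\right) \;+\; I_i,
  \qquad 0 < \varepsilon \ll 1 .
\]

Away from a switching event, u_j = O(1) and g(u_j/\varepsilon) is effectively saturated at \pm 1, so the outer (slow) system evolves with fixed saturated outputs. Near a switching time t_0, the rescaling u_i = \varepsilon w_i, \; t = t_0 + \varepsilon\tau gives the fast boundary-layer equation

\[
  \frac{dw_i}{d\tau} \;=\; \sum_j T_{ij}\, g(w_j) \;+\; I_i \;+\; O(\varepsilon),
\]

so fine-step integration of the stiff full system is needed only inside these thin layers, which is consistent with the acceleration described in the abstract.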