Publication
NeurIPS 2020
Workshop paper

Orthogonal Laguerre Recurrent Neural Networks


Abstract

The inability of RNN architectures to incorporate stable discrete-time linear time-invariant (LTI) dynamics is a long-standing problem that has also limited their performance in practice. To mitigate this problem, we propose an RNN architecture that embeds an LTI system basis comprising Laguerre polynomials inside its structure. Laguerre functions are a family of eigenfunctions arising from the Sturm-Liouville problem, characterized by their orthonormality. The full state-space representation provided by these orthonormal functions is used by our RNN in two variants: a general orthogonal Laguerre network and a Ladder network. The state of such systems preserves input information over longer periods of time through an orthogonal encoding. Both variants are enhanced by a non-linear static output layer that projects the input, the state, the memory, and the past output onto the hidden state. The proposed models are benchmarked exhaustively against other dynamical-systems-inspired RNNs on two physics-based benchmarks, which demonstrate their superior performance.
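The abstract does not give equations, but the core mechanism it describes, a stable orthonormal Laguerre state-space system feeding a non-linear readout, can be sketched as below. This is a minimal illustration, not the paper's implementation: it assumes the standard balanced state-space realization of the discrete Laguerre basis with pole `a` (|a| < 1 guarantees stability), and the function names, the readout shape, and the weight initialization are all hypothetical.

```python
import numpy as np

def laguerre_state_space(q, a):
    """Hypothetical helper: build (A, b) for a q-th order discrete-time
    Laguerre basis with pole a (|a| < 1 for stability), using the standard
    balanced realization. The impulse responses of the q state components
    are orthonormal in l2, i.e. A A^T + b b^T = I."""
    s = np.sqrt(1.0 - a**2)
    A = np.zeros((q, q))
    b = np.zeros(q)
    for i in range(q):
        A[i, i] = a                       # stable diagonal (pole a)
        b[i] = s * (-a)**i
        for j in range(i):                # strictly lower-triangular coupling
            A[i, j] = (1.0 - a**2) * (-a)**(i - j - 1)
    return A, b

def run_laguerre_rnn(u, q=8, a=0.7, d_h=16, seed=0):
    """Toy cell: the fixed LTI Laguerre state x encodes the input history
    orthogonally; a static tanh layer projects input, state, and past
    output to the hidden state (a rough analogue of the readout the
    abstract describes)."""
    rng = np.random.default_rng(seed)
    A, b = laguerre_state_space(q, a)
    W = rng.normal(scale=0.1, size=(d_h, 1 + q + 1))  # [u; x; y_prev]
    w_out = rng.normal(scale=0.1, size=d_h)
    x, y = np.zeros(q), 0.0
    ys = []
    for u_k in u:
        h = np.tanh(W @ np.concatenate(([u_k], x, [y])))
        y = float(w_out @ h)
        x = A @ x + b * u_k               # linear, provably stable memory
        ys.append(y)
    return np.array(ys)
```

The key property is that the memory matrix `A` is fixed and stable by construction, so only the static readout weights would need training; the orthonormality identity `A A^T + b b^T = I` is what makes the state an orthogonal encoding of the input history.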

Date

06 Dec 2020