Finding structure in time (01 Jan 1990)
This article introduces the 'simple recurrent network' (SRN) architecture, which has been widely applied to problems involving serially ordered patterns. Traditional feedforward networks learn to map static inputs to outputs. However, there are many phenomena in which time figures as a critical dimension, and for which recurrent networks such as the SRN are useful. In the SRN, internal states ("hidden units") feed back onto themselves at successive time steps. These recurrent connections provide the network with a memory that can be used to solve problems with temporal structure. A number of simulations are reported in this paper in which the SRN is trained on a prediction task. In the course of learning to predict various time series, the network learns things such as the implicit word boundaries in letter strings, or the semantic and syntactic categories that underlie an artificial grammar.
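The architecture described above can be sketched in a few lines: at each time step, the hidden units combine the current input with a copy of the previous hidden state (the "context" units), and the output is trained to predict the next element of the sequence. The following is a minimal illustrative sketch in NumPy; the layer sizes, initialization, and variable names are assumptions for demonstration, not details from the paper.

```python
import numpy as np

# Illustrative sizes (assumed, not from the paper).
rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 4

W_xh = rng.normal(0, 0.1, (n_hidden, n_in))      # input -> hidden
W_hh = rng.normal(0, 0.1, (n_hidden, n_hidden))  # context (previous hidden) -> hidden
W_hy = rng.normal(0, 0.1, (n_out, n_hidden))     # hidden -> output

def srn_step(x, h_prev):
    """One SRN time step: the new hidden state mixes the current input
    with the previous hidden state; the output is a prediction."""
    h = np.tanh(W_xh @ x + W_hh @ h_prev)
    y = W_hy @ h
    return h, y

# Run a short one-hot sequence; in Elman's prediction task each output
# would be trained (e.g. by backpropagation) to match the next input.
h = np.zeros(n_hidden)
for x in np.eye(n_in):  # four one-hot input vectors, one per time step
    h, y = srn_step(x, h)

print(h.shape, y.shape)
```

Because the hidden state at each step depends on the whole preceding sequence, the network can pick up regularities, such as which letters tend to follow a word boundary, without any explicit memory buffer.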
Article URL: http://cognitrn.psych.indiana.edu/rgoldsto/cogsci/Larkin.pdf