STORM: An Unsupervised Connectionist Model for Language Acquisition
Abstract from thesis

The Temporal Cascade Sequencer (TCS) is an unsupervised, recurrent neural network that uses temporal Hebbian learning to strengthen its internal pathways so that they come to represent commonly occurring sequences in its input data. Once these temporal pathways have been constructed, the TCS triggers a specific output whenever it encounters the same sequence of inputs again.
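The core temporal Hebbian idea described above can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the learning rate, decay term, and unit coding are all assumptions made here for clarity. A connection from unit i to unit j is strengthened whenever i is active at time t-1 and j is active at time t, so repeatedly presented input sequences carve out strong pathways.

```python
import numpy as np

def temporal_hebbian_step(W, prev_activity, curr_activity, lr=0.1, decay=0.01):
    """One temporal Hebbian update: strengthen W[i, j] when unit i was
    active at t-1 and unit j is active at t; mild decay lets unused
    pathways fade. lr and decay are illustrative values."""
    W += lr * np.outer(prev_activity, curr_activity)
    W *= (1.0 - decay)
    return W

# Toy run: repeatedly present the two-step sequence unit 0 -> unit 1.
W = np.zeros((3, 3))
step_a = np.array([1.0, 0.0, 0.0])  # unit 0 active at t-1
step_b = np.array([0.0, 1.0, 0.0])  # unit 1 active at t
for _ in range(20):
    W = temporal_hebbian_step(W, step_a, step_b)

# The 0 -> 1 pathway now dominates the never-seen 0 -> 2 pathway.
assert W[0, 1] > W[0, 2]
```

After training, a strong entry in W marks a learned temporal pathway, which is the mechanism by which the network later recognises the same sequence when it recurs.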
The TCS acts as a temporal pre-processor: by automatically identifying sequences, and higher-level sequences-of-sequences, within its input data, it can present these pre-identified sequences as inputs to a secondary, supervised neural network. The secondary network can then use supervised learning to map specific combinations of sequences to relevant outputs. By presenting the secondary network with pre-identified sequences, the TCS dramatically reduces the time required to train it.
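The two-stage arrangement can be sketched as below. The interface is an assumption on my part: here the unsupervised stage is reduced to a fixed-length indicator vector ("sequence k was detected"), and the supervised stage is a simple perceptron standing in for whatever secondary network the thesis pairs with the TCS. The point is only that the supervised stage learns a mapping over pre-identified sequences rather than over raw input.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=0.5):
    """Train a threshold perceptron; stands in for the secondary,
    supervised network that maps sequence detections to outputs."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1.0 if xi @ w + b > 0 else 0.0
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

# Hypothetical pre-processor output: column k = "sequence k detected".
X = np.array([[1, 0, 0],
              [0, 1, 0],
              [1, 1, 0],
              [0, 0, 1]], dtype=float)
y = np.array([1, 0, 1, 0], dtype=float)  # respond only when sequence 0 fires

w, b = perceptron_train(X, y)
preds = [1.0 if xi @ w + b > 0 else 0.0 for xi in X]
assert preds == list(y)
```

Because the inputs are already discrete sequence detections, the supervised mapping is linearly separable in this toy case and the perceptron converges in a couple of epochs, which is the training-time saving the abstract claims.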
Theoretically, the TCS is applicable to almost every area of sequence recognition, including image processing, computer virus detection and, especially, speech recognition. Its pre-processing abilities could complement many existing static and dynamic neural network applications and significantly reduce training times. However, while the prototype excels at simple sequence detection, further development is needed before its full potential for detecting higher-level sequences-of-sequences can be exploited. Specifically, this development will involve chaining TCS layers together in order to find sequences within sequences. Further research is also required to refine the interface between the TCS and the secondary network.
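The proposed chaining could work roughly as follows. Since this is flagged as future work in the abstract, the sketch below is purely illustrative: each layer is abstracted to a pattern matcher that replaces a recognised sequence with an ID, and the next layer treats those IDs as symbols, so "sequences within sequences" fall out of stacking.

```python
def make_detector(patterns):
    """Return a function mapping a window of symbols to a pattern ID
    (or None). Stands in for one trained sequencing layer."""
    def detect(window):
        for pid, pat in enumerate(patterns):
            if tuple(window) == tuple(pat):
                return pid
        return None
    return detect

layer1 = make_detector([("a", "b"), ("c", "d")])  # low-level sequences
layer2 = make_detector([(0, 1)])                  # sequence-of-sequences

stream = ["a", "b", "c", "d"]
# Layer 1 compresses the raw stream into sequence IDs...
ids = [layer1(stream[i:i + 2]) for i in range(0, len(stream), 2)]
# ...and layer 2 recognises the higher-level pattern over those IDs.
assert ids == [0, 1]
assert layer2(ids) == 0
```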
Return to Adrian Hopgood's page.