A HYPERMAP APPROACH TO MULTIPLE SEQUENCE PROCESSING

A. Nyamapfene

Keywords

Sequence processing, temporal hypermap, gated multi-net, child language, neural network

Abstract

A self-organizing neural network that learns and recalls multiple sequences is presented. Each sequence may contain recurring items, and several sequences may share one or more common items. The self-organizing temporal network stores each sequence independently of the others, so learning a new sequence does not require retraining the network on previously learnt sequences. An experimental assessment on a benchmark set of sequences suggests that the network recalls stored sequences intact when presented with their sequence identity vectors or with their constituent subsequences. The utility of the proposed temporal network, which also exhibits multimodality, is demonstrated by incorporating it into a gated multi-net model of early child language development.
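The recall behavior the abstract describes can be illustrated with a toy sketch. This is not the paper's temporal hypermap but a minimal stand-in, with hypothetical names throughout: sequences are stored independently under an identity vector, and a stored sequence can be recalled either from that identity vector or from any contiguous subsequence of it.

```python
# Toy illustration only (not the paper's hypermap): a store that
# recalls a full sequence from its identity vector or from any
# contiguous constituent subsequence. All names are hypothetical.

class SequenceStore:
    def __init__(self):
        # identity vector (as a tuple) -> stored sequence
        self._by_id = {}

    def learn(self, identity, sequence):
        # Each sequence is stored independently; learning a new one
        # does not disturb previously stored sequences.
        self._by_id[tuple(identity)] = list(sequence)

    def recall_by_identity(self, identity):
        return self._by_id.get(tuple(identity))

    def recall_by_subsequence(self, subseq):
        # Return the first stored sequence containing subseq as a
        # contiguous run; sequences may share common items, and a
        # sequence may contain recurring items.
        subseq = list(subseq)
        n = len(subseq)
        for seq in self._by_id.values():
            if any(seq[i:i + n] == subseq for i in range(len(seq) - n + 1)):
                return seq
        return None
```

For example, two sequences sharing the item "b" can be stored and then retrieved intact from either their identity vectors or a distinguishing subsequence.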
