Methods for Integrating Memory into Neural Networks in Condition Monitoring

J.F.D. Addison, S. Wermter, K.J. McGarry, and J. MacIntyre (UK)

Keywords

Neural networks, memory systems, recurrency

Abstract

A common criticism of neural network architectures is their susceptibility to “catastrophic interference”: the tendency to forget previously learned data when presented with new patterns. To avoid this, neural network architectures have been developed which explicitly provide the network with a memory, either through a context unit that stores patterns for later recall, or by combining a high degree of recurrency with some form of back-propagation. We have evaluated two architectures which embody these concepts, namely Hopfield and Elman networks, and compared their performance with that of self-organising feature maps trained on time-smoothed moving-average data and with time-delay neural networks. Our results indicate clear improvements in performance for networks which incorporate memory into their structure. However, the degree of improvement depends largely upon the architecture used and upon the provision of a context layer for the storage and recall of patterns.
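To make the context-layer idea concrete, the sketch below implements a minimal Elman-style forward pass in Python/NumPy. The context holds a copy of the previous hidden state, so each new input is processed together with a trace of what came before. All layer sizes, weights, and the input sequence are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal Elman-style forward pass: the context layer is a copy of the
# previous hidden state, giving the network a short-term memory of the
# input history. Dimensions and weights here are hypothetical.

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 2                           # assumed sizes

W_in  = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
W_ctx = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output

def step(x, context):
    """One time step: the hidden state depends on the current input
    and on the context (the hidden state from the previous step)."""
    hidden = np.tanh(W_in @ x + W_ctx @ context)
    output = W_out @ hidden
    return output, hidden      # the new hidden state becomes the next context

context = np.zeros(n_hidden)               # context starts empty
sequence = rng.normal(size=(10, n_in))     # a dummy condition-monitoring sequence
for x in sequence:
    y, context = step(x, context)          # context carries memory across steps
```

In a trained network the copied-back hidden state is what lets patterns influence later predictions, which is the memory mechanism the Elman architecture contributes in the comparison above.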
