Methods for Integrating Memory into Neural Networks in Condition Monitoring

J. F. Dale Addison, Stefan Wermter, Ken McGarry, J. MacIntyre
Proceedings of the International Conference on Artificial Intelligence and Soft Computing, pages 380--384, Jul 2002
A criticism of neural network architectures is their susceptibility to “catastrophic interference”: the tendency to forget previously learned data when presented with new patterns. To avoid this, neural network architectures have been developed which specifically provide the network with a memory, either through the use of a context unit, which can store patterns for later recall, or by combining high levels of recurrency with some form of backpropagation. We have evaluated two architectures which utilise these concepts, namely Hopfield and Elman networks respectively, and compared their performance to self-organising feature maps using time-smoothed moving-average data and time-delayed neural networks. Our results indicate clear improvements in performance for networks incorporating memory into their structure. However, the degree of improvement depends largely upon the architecture used and the provision of a context layer for the storage and recall of patterns.
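To illustrate the context-layer idea described in the abstract, the following is a minimal sketch of an Elman-style forward pass in NumPy. It is not the authors' configuration; the layer sizes, weights, and helper function `elman_forward` are illustrative assumptions. The key point is that the context layer holds a copy of the previous hidden state, giving the network a simple memory of earlier inputs in the sequence.

```python
import numpy as np

# Illustrative sketch only: sizes and weights are arbitrary, not taken
# from the paper's condition-monitoring experiments.
rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 3, 5, 2
W_ih = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input -> hidden
W_ch = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # context -> hidden
W_ho = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output

def elman_forward(sequence):
    """Run a sequence of input vectors through the network, one step at a time."""
    context = np.zeros(n_hidden)  # context units start empty
    outputs = []
    for x in sequence:
        # Hidden activation depends on the current input AND the stored context.
        hidden = np.tanh(W_ih @ x + W_ch @ context)
        outputs.append(W_ho @ hidden)
        context = hidden  # copy hidden state into the context layer for recall
    return np.array(outputs)

seq = rng.normal(size=(4, n_in))  # a toy sequence of 4 time steps
out = elman_forward(seq)
print(out.shape)  # (4, 2)
```

Because the context weights `W_ch` feed the previous hidden state back in, the output at each step depends on the whole history of the sequence, which is the memory property the paper contrasts with purely feedforward architectures.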


@InProceedings{AWMM02, 
 	 author =  {Addison, J. F. Dale and Wermter, Stefan and McGarry, Ken and MacIntyre, J.},  
 	 title = {Methods for Integrating Memory into Neural Networks in Condition Monitoring}, 
 	 booktitle = {Proceedings of the International Conference on Artificial Intelligence and Soft Computing},
 	 pages = {380--384},
 	 year = {2002},
 	 month = {Jul},
 	 publisher = {Springer},
 }