Effect of Pruning on Catastrophic Forgetting in Growing Dual Memory Networks

Wei Shiung Liew, Chu Kiong Loo, Vadym Gryshchuk, Cornelius Weber, Stefan Wermter
Proceedings of the International Joint Conference on Neural Networks (IJCNN 2019), doi: 10.1109/IJCNN.2019.8851865 - Jul 2019
Grow-when-required networks such as the Growing Dual-Memory (GDM) networks possess a dynamic network structure, expanding to accommodate new neurons in response to learning novel concepts. Over time, it may be necessary to prune obsolete neurons and/or neural connections to meet performance or resource limitations. GDM networks utilize an age-based pruning strategy, whereby older neurons and neural connections that have not been activated recently are removed. Catastrophic forgetting occurs when knowledge learned by the networks in previous learning iterations is lost, either because it is overwritten by newer learning iterations or because it is removed by the pruning process. In this work, we investigate catastrophic forgetting in GDM networks under different pruning strategies. The age-based pruning method was shown to significantly sparsify the GDM network topology while improving the network's ability to recall newly acquired concepts, at the cost of a slight decrease in performance on older knowledge. A significance-based pruning method was tested as a replacement for age-based pruning, but was not as effective at pruning even though it performed better at recalling older knowledge.
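The age-based strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the representation of units as `(index, age)` pairs, and the `max_age` threshold are assumptions made for the example.

```python
def age_based_prune(ages, activated, max_age):
    """One age-based pruning step for a grow-when-required style network.

    ages      -- current age of each unit (neuron or connection), by index
    activated -- set of indices activated in this iteration (age resets to 0)
    max_age   -- units whose updated age exceeds this threshold are removed

    Returns the surviving units as (index, new_age) pairs.
    """
    survivors = []
    for i, age in enumerate(ages):
        # Activation resets a unit's age; otherwise it ages by one step.
        new_age = 0 if i in activated else age + 1
        if new_age <= max_age:
            survivors.append((i, new_age))
    return survivors

# Unit 0 was just activated, unit 1 is aging but within the limit,
# and unit 2 has gone unactivated too long and is pruned.
print(age_based_prune([0, 5, 10], {0}, 6))  # [(0, 0), (1, 6)]
```

Pruning by recency in this way sparsifies the topology, but a unit that encodes old yet still-relevant knowledge is removed as readily as a genuinely obsolete one, which is one route to the catastrophic forgetting the paper measures.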

 

@InProceedings{LLGWW19,
  author    = {Liew, Wei Shiung and Loo, Chu Kiong and Gryshchuk, Vadym and Weber, Cornelius and Wermter, Stefan},
  title     = {Effect of Pruning on Catastrophic Forgetting in Growing Dual Memory Networks},
  booktitle = {Proceedings of the International Joint Conference on Neural Networks (IJCNN 2019)},
  year      = {2019},
  month     = {Jul},
  doi       = {10.1109/IJCNN.2019.8851865}
}