Continuous Time Recurrent Neural Networks for Grammatical Induction

Joseph Chen, Stefan Wermter
Proceedings of the International Conference on Artificial Neural Networks, pages 381-386, doi: 10.1007/978-1-4471-1599-1_56, Nov 1998
In this paper we explore continuous time recurrent networks for grammatical induction. A higher-level generating/processing scheme can be used to tackle the grammar induction problem. Experiments are performed on several types of grammars, including a family of languages known as Tomita languages and a context-free language. The system and the experiments demonstrate that continuous time recurrent networks can learn certain grammatical induction tasks.
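
For context, "continuous time recurrent network" usually refers to dynamics of the form tau_i * dy_i/dt = -y_i + sum_j w_ij * sigma(y_j) + I_i; the paper's exact formulation and training procedure are not reproduced here. The sketch below is a minimal, untrained Euler-integrated CTRNN driven by a binary string, paired with a membership test for Tomita language 1 (strings over {0,1} containing only 1s). The network size, random weights, time step, and one-hot input encoding are illustrative assumptions, not parameters from the paper.

# Illustrative sketch only: an untrained CTRNN integrated with forward Euler,
# plus a membership test for Tomita language 1. Weights are random, not the
# trained parameters from the paper.
import numpy as np

def sigma(x):
    """Logistic activation applied elementwise."""
    return 1.0 / (1.0 + np.exp(-x))

class CTRNN:
    def __init__(self, n_units, n_inputs, dt=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 1.0, (n_units, n_units))      # recurrent weights (assumed)
        self.w_in = rng.normal(0.0, 1.0, (n_units, n_inputs))  # input weights (assumed)
        self.tau = np.ones(n_units)                            # unit time constants
        self.dt = dt
        self.y = np.zeros(n_units)                             # membrane state

    def step(self, u):
        # One forward-Euler step of tau * dy/dt = -y + W sigma(y) + W_in u.
        drive = self.w @ sigma(self.y) + self.w_in @ u
        self.y = self.y + self.dt * (-self.y + drive) / self.tau
        return sigma(self.y)

def tomita1(s):
    """Tomita language 1: strings over {0,1} consisting only of 1s."""
    return '0' not in s

# Feed a symbol string into the (untrained) network; in a grammar induction
# setup the final activation of a trained network would be thresholded to
# accept or reject the string.
net = CTRNN(n_units=4, n_inputs=2)
for ch in "1101":
    u = np.array([1.0, 0.0]) if ch == '0' else np.array([0.0, 1.0])
    out = net.step(u)
print("final activations:", out, "| in Tomita-1:", tomita1("1101"))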

 

@InProceedings{CW98,
	 author = {Chen, Joseph and Wermter, Stefan},
	 title = {Continuous Time Recurrent Neural Networks for Grammatical Induction},
	 booktitle = {Proceedings of the International Conference on Artificial Neural Networks},
	 pages = {381--386},
	 year = {1998},
	 month = {Nov},
	 doi = {10.1007/978-1-4471-1599-1_56}
}