Symbolic State Transducers and Recurrent Neural Preference Machines for Text Mining

Garen Arevian, Stefan Wermter, Christo Panchev
International Journal of Approximate Reasoning, Volume 32, Number 2–3, pages 237–258, Feb 2003. doi: 10.1016/S0888-613X(02)00085-3
This paper focuses on symbolic transducers and recurrent neural preference machines to support the task of mining and classifying textual information. The knowledge-encoding symbolic transducers and the learning neural preference machines can be seen as independent agents, each tackling the same task in a different manner. Systems combining such machines can potentially be more robust, since the strengths and weaknesses of the different approaches yield complementary knowledge, with each machine modeling the same information content under a different paradigm. An experimental analysis of the performance of the symbolic transducer and neural preference machines is presented, and it is demonstrated that each approach can be used successfully for information mining and news classification on the Reuters news corpus. Symbolic transducer machines allow relevant knowledge to be encoded quickly by hand, guided by the data but without any training, while neural preference machines can achieve better performance through additional training.
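To make the contrast between the two paradigms concrete, the following is a minimal, purely illustrative Python sketch: a hand-encoded keyword transducer versus an untrained Elman-style recurrent network that outputs a preference score per class. The category names, keyword rules, vocabulary, and network sizes are invented for demonstration and do not come from the paper; the actual systems described in the article are considerably more elaborate.

# Illustrative sketch only: rules, categories, and weights are hypothetical,
# not taken from the paper.
import math
import random

# --- Symbolic transducer: hand-encoded keyword transitions, no training ---
RULES = {            # hypothetical keyword -> category transitions
    "stocks": "markets",
    "merger": "corporate",
    "tariff": "trade",
}

def transduce(tokens):
    """Scan the token stream; each matching keyword moves the state machine."""
    state = "unknown"
    for tok in tokens:
        state = RULES.get(tok.lower(), state)
    return state

# --- Recurrent preference machine: Elman-style net emitting class scores ---
class RecurrentPreferenceMachine:
    def __init__(self, vocab, classes, hidden=8, seed=0):
        rng = random.Random(seed)
        self.vocab = {w: i for i, w in enumerate(vocab)}
        self.classes = classes
        n_in, n_out = len(vocab), len(classes)
        # Random (untrained) weights; in practice these would be learned.
        self.W_in  = [[rng.uniform(-.5, .5) for _ in range(n_in)]  for _ in range(hidden)]
        self.W_rec = [[rng.uniform(-.5, .5) for _ in range(hidden)] for _ in range(hidden)]
        self.W_out = [[rng.uniform(-.5, .5) for _ in range(hidden)] for _ in range(n_out)]

    def preferences(self, tokens):
        h = [0.0] * len(self.W_rec)          # recurrent hidden state
        for tok in tokens:
            x = [0.0] * len(self.vocab)      # one-hot encoding of the token
            if tok.lower() in self.vocab:
                x[self.vocab[tok.lower()]] = 1.0
            h = [math.tanh(sum(wi * xi for wi, xi in zip(row_in, x)) +
                           sum(wr * hj for wr, hj in zip(row_rec, h)))
                 for row_in, row_rec in zip(self.W_in, self.W_rec)]
        scores = [sum(w * hj for w, hj in zip(row, h)) for row in self.W_out]
        return dict(zip(self.classes, scores))   # preference value per class

headline = "Tariff talks rattle stocks".split()
print(transduce(headline))                       # symbolic route: rule lookup
rpm = RecurrentPreferenceMachine(
    vocab=["tariff", "stocks", "merger"],
    classes=["markets", "corporate", "trade"])
print(rpm.preferences(headline))                 # neural route: needs training

The sketch mirrors the trade-off stated in the abstract: the transducer works immediately from hand-written rules, whereas the recurrent machine produces meaningful preferences only after its weights have been trained on labeled news items.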


@Article{AWP03,
  author    = {Arevian, Garen and Wermter, Stefan and Panchev, Christo},
  title     = {Symbolic State Transducers and Recurrent Neural Preference Machines for Text Mining},
  journal   = {International Journal of Approximate Reasoning},
  volume    = {32},
  number    = {2--3},
  pages     = {237--258},
  year      = {2003},
  month     = feb,
  publisher = {Elsevier},
  doi       = {10.1016/S0888-613X(02)00085-3},
}