A recurrent neural network for multiple language acquisition: Starting with English and French
Proceedings of the 2015 International Conference on Neural Information Processing Systems (NIPS 2015), Workshop on Cognitive Computation: Integrating Neural and Symbolic Approaches, Volume 1583, December 2015
How humans acquire language, and in particular two or more different languages
with the same neural computing substrate, remains an open question. To address it,
we suggest building models that can process any language from the very beginning.
Here we propose a developmental and neuro-inspired approach that processes
sentences word by word with no prior knowledge of the semantics of the words.
Our model has no pre-wired structure, only random and learned connections: it is
based on Reservoir Computing. Our previous model was implemented on robotic
platforms, where users could teach the basics of English to instruct a robot to
perform actions. In this paper, we add the ability to process infrequent words,
which allows us to keep the vocabulary very small while still processing natural
language sentences. Moreover, we extend this approach to the French language and
demonstrate that the network can learn both languages at the same time. Even with
small corpora, the model is able to learn and generalize in both monolingual and
bilingual conditions. For small corpora in different languages, this approach
promises to be a more practical alternative than supervised learning methods that
rely on big data sets, or than hand-crafted parsers that require extensive manual
encoding effort.
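To make the Reservoir Computing idea concrete, below is a minimal sketch in Python/NumPy of an echo state network that reads a sentence word by word and trains only a linear readout, while the recurrent reservoir keeps its random connections. The vocabulary, the "<unk>" token for infrequent words, the reservoir size, leak rate, spectral radius, and the number of output role slots are illustrative assumptions, not the parameters or code used in the paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical vocabulary; "<unk>" stands in for infrequent words.
vocab = ["the", "ball", "push", "to", "left", "<unk>"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

n_in, n_res, n_out = len(vocab), 300, 4   # n_out = assumed number of role slots
leak, spectral_radius = 0.3, 1.2          # assumed hyperparameters

# Random input and recurrent weights; only the readout will be learned.
W_in = rng.uniform(-1.0, 1.0, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(sentence):
    """Return the reservoir state after reading the sentence word by word."""
    x = np.zeros(n_res)
    for word in sentence.lower().split():
        u = np.zeros(n_in)
        u[word_to_idx.get(word, word_to_idx["<unk>"])] = 1.0  # unknown -> <unk>
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)  # leaky update
    return x

def train_readout(sentences, targets, ridge=1e-3):
    """Fit the linear readout with ridge regression on (sentence, role-coding) pairs."""
    X = np.stack([run_reservoir(s) for s in sentences])   # (n_samples, n_res)
    Y = np.asarray(targets, dtype=float)                  # (n_samples, n_out)
    return np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)  # W_out

A new sentence is then decoded as run_reservoir(sentence) @ W_out. Because the network sees only word identities and never a language label, English and French sentences can be mixed in the same training set, which is the setting explored in the paper.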
@InProceedings{HTPDW15,
  author    = {Hinaut, Xavier and Twiefel, Johannes and Petit, Maxime and Dominey, Peter and Wermter, Stefan},
  title     = {A recurrent neural network for multiple language acquisition: Starting with English and French},
  booktitle = {Proceedings of the 2015 International Conference on Neural Information Processing Systems (NIPS 2015), Workshop on Cognitive Computation: Integrating Neural and Symbolic Approaches},
  volume    = {1583},
  year      = {2015},
  month     = {Dec},
}