Crossmodal Language Grounding in an Embodied Neurocognitive Model
Human infants acquire natural language seemingly effortlessly at an early age. Their language learning appears to occur alongside the learning of other cognitive functions and through playful interaction with the environment and caregivers. From a neuroscientific perspective, natural language is embodied, grounded in most, if not all, sensory and sensorimotor modalities, and acquired by means of crossmodal integration. However, characterising the underlying mechanisms in the brain is difficult, and explaining the grounding of language in crossmodal perception and action remains challenging. In this paper, we present a neurocognitive model for language grounding that reflects bio-inspired mechanisms such as an implicit adaptation of timescales as well as end-to-end multimodal abstraction. It addresses developmental robotic interaction and extends its learning capabilities using larger-scale knowledge-based data. In our scenario, we utilise the humanoid robot NICO to obtain the EMIL data collection, in which the cognitive robot interacts with objects in a children's playground environment while receiving linguistic labels from a caregiver. The model analysis shows that crossmodally integrated representations are sufficient for acquiring language merely from sensory input through interaction with objects in an environment. The representations self-organise hierarchically and embed temporal and spatial information through composition and decomposition. The model can also provide a basis for further crossmodal integration of perceptually grounded cognitive representations.
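To illustrate what an "adaptation of timescales" can mean in a recurrent neural model, the following minimal sketch shows a leaky-integrator (continuous-time) recurrent update in which each unit integrates its input with a per-unit timescale tau: units with a large tau change slowly and tend to capture more abstract, slowly varying structure, while units with a small tau track fast sensory dynamics. All names, sizes, and values are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid = 8, 16                        # input and hidden sizes (arbitrary)
tau = np.concatenate([np.full(8, 2.0),     # "fast" units
                      np.full(8, 16.0)])   # "slow" units

W_in = rng.normal(0, 0.1, (n_hid, n_in))   # input weights
W_rec = rng.normal(0, 0.1, (n_hid, n_hid)) # recurrent weights
b = np.zeros(n_hid)

def step(u_prev, y_prev, x_t):
    """One leaky-integrator RNN step with per-unit timescales."""
    # Blend the previous internal state with the new synaptic input,
    # weighted by 1/tau: large tau -> slow change, small tau -> fast change.
    u_t = (1.0 - 1.0 / tau) * u_prev + (1.0 / tau) * (W_in @ x_t + W_rec @ y_prev + b)
    y_t = np.tanh(u_t)                     # unit activation
    return u_t, y_t

# Run a short random input sequence through the unit.
u, y = np.zeros(n_hid), np.zeros(n_hid)
for t in range(20):
    x = rng.normal(size=n_in)              # placeholder sensory input
    u, y = step(u, y, x)
print(y[:4])

In such architectures the timescales can be fixed by hand or, as suggested by the term "implicit adaptation", learned from data together with the weights; this sketch only shows the fixed-tau case.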
@Article{HYHLHKWW20,
  author  = {Heinrich, Stefan and Yao, Yuan and Hinz, Tobias and Liu, Zhiyuan and Hummel, Thomas and Kerzel, Matthias and Weber, Cornelius and Wermter, Stefan},
  title   = {Crossmodal Language Grounding in an Embodied Neurocognitive Model},
  journal = {Frontiers in Neurorobotics},
  year    = {2020},
  month   = {Oct},
  doi     = {10.3389/fnbot.2020.00052},
}