Towards Computational Modelling of Neural Multimodal Integration Based on the Superior Colliculus Concept

Kiran Ravulakollu, Michael Knowles, Jindong Liu, Stefan Wermter
Innovations in Neural Information Paradigms and Applications, Volume 247, pages 269--291, 2009
Processing sensory input and responding to it with appropriate actions are among the most important capabilities of the brain, which has dedicated areas for auditory and visual processing. Auditory information passes first through the cochlea, then the inferior colliculus, and finally the auditory cortex, where it is processed further so that the eyes, the head, or both can be turned towards an object or location in response. Visual information is processed in the retina, various subsequent nuclei and the visual cortex before actions are performed. However, how is this information integrated, and what is the effect of auditory and visual stimuli arriving at the same time or at different times? Which information is processed when, and what are the responses to multimodal stimuli? Multimodal integration is first performed in the superior colliculus, a subcortical structure in the midbrain. In this chapter we will focus on this first level of multimodal integration, outline various approaches to modelling the superior colliculus, and suggest a model of multimodal integration of visual and auditory information.
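As a rough illustration of the kind of integration discussed here (not the model proposed in the chapter), the following Python sketch combines visual and auditory activity on a one-dimensional spatial map and shows that spatially aligned stimuli yield an enhanced response compared with disparate ones; all function names and parameter values are illustrative assumptions.

# Minimal sketch of audio-visual integration on a 1-D spatial map,
# loosely inspired by superior-colliculus-style enhancement.
# Everything here is an illustrative assumption, not the chapter's model.

import numpy as np

def gaussian_bump(centre_deg, width_deg, axis_deg):
    """Unimodal activity profile over a spatial axis (degrees of azimuth)."""
    return np.exp(-0.5 * ((axis_deg - centre_deg) / width_deg) ** 2)

def integrate(visual, auditory, coincidence_gain=0.5):
    """Combine unimodal maps; spatially coincident input is boosted
    multiplicatively, giving a superadditive (enhanced) response."""
    return visual + auditory + coincidence_gain * visual * auditory

axis = np.linspace(-90, 90, 181)            # azimuth in degrees
visual  = gaussian_bump(10.0, 5.0, axis)    # visual stimulus at +10 deg
aligned = gaussian_bump(10.0, 15.0, axis)   # auditory stimulus at the same location
far_off = gaussian_bump(-60.0, 15.0, axis)  # auditory stimulus far away

print("peak response, aligned stimuli:  ", round(integrate(visual, aligned).max(), 2))
print("peak response, disparate stimuli:", round(integrate(visual, far_off).max(), 2))

Running the sketch prints a larger peak for the aligned pair than for the disparate pair, which is the qualitative signature of multisensory enhancement the chapter's questions point to.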


@Article{RKLW09a,
  author    = {Ravulakollu, Kiran and Knowles, Michael and Liu, Jindong and Wermter, Stefan},
  title     = {Towards Computational Modelling of Neural Multimodal Integration Based on the Superior Colliculus Concept},
  journal   = {Innovations in Neural Information Paradigms and Applications},
  volume    = {247},
  pages     = {269--291},
  year      = {2009},
  publisher = {Springer},
}