Hear the Egg — Demonstrating Robotic Interactive Auditory Perception

Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2018, Oct 2018. doi: 10.1109/IROS.2018.8593959
We present an illustrative example of an interactive auditory perception approach performed by the humanoid robot NICO, the Neuro Inspired COmpanion [1]. The video demonstrates a material classification task in the style of a classic TV game show: NICO and another candidate have to determine the content of small plastic capsules that are visually indistinguishable. Shaking the capsules produces audio signals ranging from rattling stones and tinkling coins to swooshing sand, and NICO can perceive and analyze these sounds to determine the material inside each capsule. Though presented in an edutainment context, the ability to distinguish objects and their contents by eliciting and classifying audio signals is not only helpful but in fact necessary for a humanoid companion robot that assists humans in an unstructured domestic environment. Using interactive auditory perception [2], a robot can detect whether a salt shaker is empty or whether a box of biscuits really contains cookies, without opening it for visual inspection.
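The perception loop behind the demo (shake the capsule, record the resulting sound, extract audio features, classify the material) can be illustrated with a small sketch. The Python example below is not the authors' implementation; it only assumes labelled recordings of shaken capsules and uses MFCC features with an SVM classifier as one plausible choice. All file names, labels, and parameters are hypothetical.

```python
# Minimal sketch (not the authors' pipeline): classify shaking sounds by
# material using MFCC features and an SVM. Assumes labelled WAV recordings
# of shaken capsules exist on disk; file names and labels are hypothetical.
import numpy as np
import librosa
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def mfcc_features(wav_path, n_mfcc=13):
    """Load one shaking recording and summarise it as mean/std of its MFCCs."""
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical training data: recordings of shaken capsules and their contents.
train_files = [("shake_stones_01.wav", "stones"),
               ("shake_coins_01.wav", "coins"),
               ("shake_sand_01.wav", "sand")]

X = np.stack([mfcc_features(path) for path, _ in train_files])
y = [label for _, label in train_files]

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify a new, visually indistinguishable capsule from its shaking sound.
print(clf.predict(mfcc_features("shake_unknown.wav").reshape(1, -1)))
```

In practice, a robot like NICO would first have to elicit the sound itself by grasping and shaking the capsule; the sketch above only covers the subsequent classification step.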


@InProceedings{SKEGW18,
  author    = {Strahl, Erik and Kerzel, Matthias and Eppe, Manfred and Griffiths, Sascha and Wermter, Stefan},
  title     = {Hear the Egg — Demonstrating Robotic Interactive Auditory Perception},
  booktitle = {Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2018},
  year      = {2018},
  month     = {Oct},
  doi       = {10.1109/IROS.2018.8593959},
}