A Humanoid Robot Learning Audiovisual Classification By Active Exploration
2021 IEEE International Conference on Development and Learning (ICDL),
pages 1--6,
doi: 10.1109/ICDL49984.2021.9515598
- Aug 2021
We present a novel neurorobotic setup and dataset for active object exploration and audiovisual classification of objects based on their material properties. In the robotic setup, a humanoid robot drops an object onto a sloped surface and records video frames and raw audio of the collision between the object and the surface. The novel dataset comprises 32,800 images and 1,600 s of audio recordings from 800 samples of 16 objects and will be made publicly available. We propose a novel neural architecture for classifying the objects. A detailed analysis of the results shows that different materials are classified more easily in either the audio or the visual modality. As a main contribution, we show that combining both modalities achieves an even higher classification accuracy of 90%.
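The abstract does not detail the proposed architecture, so the following is only a minimal sketch of the multimodal-fusion idea it describes: all layer sizes, input shapes, and the late-fusion design are assumptions for illustration, not the paper's actual model. The sketch concatenates a small visual embedding and a small audio embedding before a shared classifier over the 16 object classes.

```python
# Hypothetical late-fusion audiovisual classifier (assumed design, not the paper's model).
import torch
import torch.nn as nn

class AudioVisualClassifier(nn.Module):
    def __init__(self, num_classes: int = 16):
        super().__init__()
        # Visual branch: encodes a single RGB frame of the dropped object.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),            # -> (B, 32)
        )
        # Audio branch: encodes a spectrogram of the impact sound.
        self.audio = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),            # -> (B, 32)
        )
        # Fusion: concatenate both embeddings and classify jointly.
        self.classifier = nn.Sequential(
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, image: torch.Tensor, spectrogram: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.visual(image), self.audio(spectrogram)], dim=1)
        return self.classifier(fused)

# Example forward pass with dummy tensors (batch of 4, assumed 64x64 inputs).
model = AudioVisualClassifier()
logits = model(torch.randn(4, 3, 64, 64), torch.randn(4, 1, 64, 64))
print(logits.shape)  # torch.Size([4, 16])
```

Fusing the two embeddings before the final classifier is one simple way to let the model exploit whichever modality is more informative for a given material, in line with the abstract's finding that combining modalities improves accuracy.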
@InProceedings{MKSW21,
  author    = {Mir, Glareh and Kerzel, Matthias and Strahl, Erik and Wermter, Stefan},
  title     = {A Humanoid Robot Learning Audiovisual Classification By Active Exploration},
  booktitle = {2021 IEEE International Conference on Development and Learning (ICDL)},
  pages     = {1--6},
  year      = {2021},
  month     = {Aug},
  publisher = {IEEE},
  doi       = {10.1109/ICDL49984.2021.9515598},
}