Deep Neural Object Analysis by Interactive Auditory Exploration with a Humanoid Robot
IEEE/RSJ International Conference on Intelligent Robots and Systems
Oct 2018
doi: 10.1109/IROS.2018.8593838
We present a novel approach for interactive auditory object analysis with a humanoid robot. The robot elicits sensory information by physically shaking visually indistinguishable plastic capsules and gathers the resulting audio signals from microphones embedded in its ears. A neural network architecture learns from these signals to analyze properties of the capsules' contents. Specifically, we evaluate material classification and weight prediction accuracy and demonstrate that the framework is fairly robust to real-world acoustic noise.
@InProceedings{EKSW18,
  author    = {Eppe, Manfred and Kerzel, Matthias and Strahl, Erik and Wermter, Stefan},
  title     = {Deep Neural Object Analysis by Interactive Auditory Exploration with a Humanoid Robot},
  booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems},
  year      = {2018},
  month     = oct,
  doi       = {10.1109/IROS.2018.8593838},
}