Neuro-Robotic Haptic Object Classification by Active Exploration on a Novel Dataset
Proceedings of the International Joint Conference on Neural Networks (IJCNN 2019),
doi: 10.1109/IJCNN.2019.8852359
- Jul 2019
We present an embodied neural model for haptic object classification by active haptic exploration with the humanoid robot NICO. When NICO's newly developed robotic hand closes around an object, multiple sensory readings from a tactile fingertip sensor, motor positions, and motor currents are recorded. We created a haptic dataset with 83,200 haptic measurements, based on 100 samples of each of 16 different objects, with each sample containing 52 measurements. First, we provide an analysis of neural classification models with regard to isolated haptic sensory channels for object classification. Based on this, we develop a series of neural models (MLP, CNN, LSTM) that integrate the haptic sensory channels to classify the explored objects. As an initial baseline, our best model achieves 66.6% classification accuracy over the 16 objects. We show that this result is due to the ability of the network to integrate the haptic data both over the time domain and across the different haptic sensory channels. Furthermore, we make the dataset publicly available to address the issue of sparse haptic datasets for machine learning research.
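The abstract describes recurrent models that integrate the haptic channels over the 52-step exploration sequence. The following is a minimal sketch, not the authors' implementation, of what such an LSTM classifier could look like in PyTorch: the sequence length (52) and the number of classes (16) come from the abstract, while the channel count, hidden size, and all identifiers are illustrative assumptions.

```python
# Hedged sketch of an LSTM classifier over multi-channel haptic sequences.
# SEQ_LEN and NUM_OBJECTS follow the abstract; NUM_CHANNELS is hypothetical,
# standing in for tactile fingertip readings, motor positions, and motor currents.
import torch
import torch.nn as nn

NUM_OBJECTS = 16    # object classes in the dataset
SEQ_LEN = 52        # measurements per exploration sample
NUM_CHANNELS = 8    # assumed channel count; depends on the actual sensor setup

class HapticLSTMClassifier(nn.Module):
    def __init__(self, hidden_size=64):
        super().__init__()
        # The LSTM integrates the haptic channels over the time domain.
        self.lstm = nn.LSTM(NUM_CHANNELS, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, NUM_OBJECTS)

    def forward(self, x):
        # x: (batch, SEQ_LEN, NUM_CHANNELS)
        _, (h_n, _) = self.lstm(x)
        return self.fc(h_n[-1])  # logits over the object classes

# Example forward pass on a random batch of haptic samples.
model = HapticLSTMClassifier()
dummy = torch.randn(4, SEQ_LEN, NUM_CHANNELS)
print(model(dummy).shape)  # torch.Size([4, 16])
```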
@InProceedings{KSGGW19,
  author    = {Kerzel, Matthias and Strahl, Erik and Gaede, Connor and Gasanov, Emil and Wermter, Stefan},
  title     = {Neuro-Robotic Haptic Object Classification by Active Exploration on a Novel Dataset},
  booktitle = {Proceedings of the International Joint Conference on Neural Networks (IJCNN 2019)},
  year      = {2019},
  month     = {Jul},
  doi       = {10.1109/IJCNN.2019.8852359},
}