Human Impression of Humanoid Robots Mirroring Social Cues

2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI'24 Companion), pages 458–462, doi: 10.1145/3610978.3640580, Mar 2024, Open Access
Mirroring non-verbal social cues such as affect or movement can enhance human-human and human-robot interactions in the real world. The robotic platform and control method also shape people's perception of human-robot interaction. However, few studies have compared robot mirroring across different platforms and control methods. Our research addresses this gap with two experiments: one comparing people's perception of affective mirroring by the iCub and Pepper robots, and one comparing movement mirroring under vision-based and Inertial Measurement Unit (IMU)-based control of the iCub. We found that the iCub robot was perceived as more humanlike than the Pepper robot when mirroring affect, and that the vision-controlled iCub outperformed the IMU-controlled one in the movement mirroring task. Our findings suggest that the robotic platform affects how people perceive a robot's mirroring during HRI, and that the control method contributes to the robot's mirroring performance. Our work sheds light on the design and application of different humanoid robots in the real world.


@InProceedings{FAAW24,
  author    = {Fu, Di and Abawi, Fares and Allgeuer, Philipp and Wermter, Stefan},
  title     = {Human Impression of Humanoid Robots Mirroring Social Cues},
  booktitle = {2024 ACM/IEEE International Conference on Human-Robot Interaction (HRI'24 Companion)},
  pages     = {458--462},
  year      = {2024},
  month     = {Mar},
  doi       = {10.1145/3610978.3640580},
}