Auditory Robotic Tracking of Sound Sources using Hybrid Cross-Correlation and Recurrent Network
International Conference on Intelligent Robots and Systems,
pages 3554--3559,
doi: 10.1109/IROS.2005.1545093
- Sep 2005
This paper describes an auditory robotic system capable of computing the angle of incidence (azimuth) of a sound source on the horizontal plane. Using an Elman-type recurrent neural network (RNN), the system can dynamically track the sound source as its azimuth changes within the environment. The RNN gives the overall system fast tracking responses over a set time interval, rather than waiting for the next sound position before moving. The system is first tested in a simulated environment, and these results are then compared with tests on the robotic system. The results show that a hybrid system combining cross-correlation and recurrent neural networks is an effective mechanism for controlling a robot that tracks sound sources azimuthally.
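The cross-correlation stage mentioned in the abstract is the classic interaural-time-difference approach to azimuth estimation. The following is a minimal sketch, not the authors' implementation: it assumes a two-microphone array with an illustrative spacing `mic_distance`, a sample rate `fs`, and a simple free-field geometric model relating time delay to angle.

```python
# Sketch of cross-correlation azimuth estimation (assumed parameters, not from the paper).
import numpy as np

def estimate_azimuth(left, right, fs=44100, mic_distance=0.3, c=343.0):
    """Estimate source azimuth (degrees) from two microphone signals."""
    # Cross-correlate the channels; the lag of the correlation peak is the
    # interaural time difference (ITD) in samples.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    itd = lag / fs                                   # delay in seconds
    # Free-field model: ITD = d * sin(theta) / c, hence theta = arcsin(ITD * c / d).
    ratio = np.clip(itd * c / mic_distance, -1.0, 1.0)
    return np.degrees(np.arcsin(ratio))

# Example: a broadband source arriving 10 samples later at the right microphone
# (i.e., located toward the left), giving a negative azimuth by this convention.
rng = np.random.default_rng(0)
sig = rng.standard_normal(4096)
print(f"estimated azimuth: {estimate_azimuth(sig, np.roll(sig, 10)):.1f} degrees")
```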
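The abstract's second component, the Elman-type RNN, predicts where the source is heading so the robot can keep moving instead of waiting for the next localisation result. The sketch below is an assumption, not the paper's network: layer sizes, the one-step truncated gradient update, and the synthetic azimuth trajectory are all illustrative choices.

```python
# Minimal Elman-type RNN for next-azimuth prediction (illustrative, not the paper's model).
import numpy as np

class ElmanRNN:
    def __init__(self, n_in=1, n_hidden=8, n_out=1, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.Wxh = rng.standard_normal((n_hidden, n_in)) * 0.1
        self.Whh = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # context (recurrent) weights
        self.Why = rng.standard_normal((n_out, n_hidden)) * 0.1
        self.lr = lr
        self.h = np.zeros(n_hidden)                                 # context units

    def step(self, x):
        # Elman update: hidden state depends on the input and the previous (context) state.
        self.h = np.tanh(self.Wxh @ x + self.Whh @ self.h)
        return self.Why @ self.h

    def train_step(self, x, target):
        # One-step truncated gradient update on squared error.
        h_prev = self.h.copy()
        err = self.step(x) - target
        dh = (self.Why.T @ err) * (1.0 - self.h ** 2)
        self.Why -= self.lr * np.outer(err, self.h)
        self.Wxh -= self.lr * np.outer(dh, x)
        self.Whh -= self.lr * np.outer(dh, h_prev)
        return float(err @ err)

# Train on a synthetic, slowly rotating source (azimuth scaled to [-1, 1]).
azimuths = np.sin(np.linspace(0, 6 * np.pi, 600))
net = ElmanRNN()
for epoch in range(20):
    net.h[:] = 0.0
    for t in range(len(azimuths) - 1):
        net.train_step(np.array([azimuths[t]]), np.array([azimuths[t + 1]]))

# Feed recent observations, then read off the predicted next azimuth.
net.h[:] = 0.0
for t in range(100):
    pred = net.step(np.array([azimuths[t]]))
print(f"predicted next azimuth (scaled): {pred[0]:.3f}, actual: {azimuths[100]:.3f}")
```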
@InProceedings{MEW05a,
  author    = {Murray, John C. and Erwin, Harry and Wermter, Stefan},
  title     = {Auditory Robotic Tracking of Sound Sources using Hybrid Cross-Correlation and Recurrent Network},
  booktitle = {International Conference on Intelligent Robots and Systems},
  pages     = {3554--3559},
  year      = {2005},
  month     = {Sep},
  publisher = {IEEE},
  doi       = {10.1109/IROS.2005.1545093},
}