Curiosity-Driven Exploration Enhances Motor Skills of Continuous Actor-Critic Learner
Proceedings of the 7th Joint IEEE International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob),
pages 39--46,
doi: 10.1109/DEVLRN.2017.8329785
- Sep 2017
Guiding the action selection mechanism of an autonomous agent that is learning control behaviors is a crucial issue in reinforcement learning. While classical approaches to reinforcement learning depend heavily on external feedback, intrinsically motivated approaches are more natural and follow the principles of infant sensorimotor development. In this work, we investigate the role of incremental learning of predictive models in generating curiosity, an intrinsic motivation, for directing the agent's choice of action, and propose a curiosity-driven reinforcement learning algorithm for continuous motor control. Our algorithm builds an internal representation of the state space that handles the computation of curiosity signals using the learned predictive models, and extends the Continuous Actor-Critic Learning Automaton (CACLA) to use both extrinsic and intrinsic feedback. Evaluation of our algorithm on simple and complex robotic control tasks shows a significant performance gain for the intrinsically motivated goal-reaching agent compared to agents that are motivated only by extrinsic rewards.
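
To make the idea concrete, below is a minimal sketch of how a curiosity bonus derived from a learned forward model could be combined with a CACLA-style actor-critic update. All names, the linear function approximators, the Gaussian exploration, and the mixing weight `beta` are illustrative assumptions for this sketch and do not reproduce the paper's exact formulation.

```python
# Minimal sketch: CACLA-style actor-critic with a curiosity bonus from a
# learned forward model. Linear approximators and the reward-mixing weight
# `beta` are assumptions, not the paper's exact method.
import numpy as np

class CuriousCacla:
    def __init__(self, state_dim, action_dim, alpha_v=0.01, alpha_pi=0.01,
                 alpha_m=0.01, gamma=0.99, beta=0.5, sigma=0.1):
        self.W_v = np.zeros(state_dim)                   # linear critic V(s)
        self.W_pi = np.zeros((action_dim, state_dim))    # linear actor pi(s)
        self.W_m = np.zeros((state_dim, state_dim + action_dim))  # forward model
        self.alpha_v, self.alpha_pi, self.alpha_m = alpha_v, alpha_pi, alpha_m
        self.gamma, self.beta, self.sigma = gamma, beta, sigma

    def act(self, s):
        # Gaussian exploration around the deterministic actor output.
        return self.W_pi @ s + self.sigma * np.random.randn(self.W_pi.shape[0])

    def update(self, s, a, r_ext, s_next):
        # Curiosity signal: prediction error of the learned forward model.
        x = np.concatenate([s, a])
        pred_error = s_next - self.W_m @ x
        r_int = np.linalg.norm(pred_error)
        self.W_m += self.alpha_m * np.outer(pred_error, x)  # improve the model

        # Combine extrinsic and intrinsic feedback (beta is an assumed weight).
        r = r_ext + self.beta * r_int

        # TD error on the combined reward drives the critic update.
        delta = r + self.gamma * (self.W_v @ s_next) - self.W_v @ s
        self.W_v += self.alpha_v * delta * s

        # CACLA rule: move the actor toward the executed action only when it
        # performed better than expected (positive TD error).
        if delta > 0:
            self.W_pi += self.alpha_pi * np.outer(a - self.W_pi @ s, s)
        return r_int, delta
```

The key design choice sketched here is that the intrinsic reward decays automatically as the forward model improves, so curiosity initially drives exploration and later yields to the extrinsic goal signal.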
@InProceedings{HWW17,
  author    = {Hafez, Burhan and Weber, Cornelius and Wermter, Stefan},
  title     = {Curiosity-Driven Exploration Enhances Motor Skills of Continuous Actor-Critic Learner},
  booktitle = {Proceedings of the 7th Joint IEEE International Conference on Development and Learning and on Epigenetic Robotics (ICDL-EpiRob)},
  pages     = {39--46},
  year      = {2017},
  month     = {Sep},
  doi       = {10.1109/DEVLRN.2017.8329785},
}