Compositional Learning of Human Activities With a Self-Organizing Neural Architecture
Frontiers in Robotics and AI, Volume 6, Article 72, August 2019.
doi: 10.3389/frobt.2019.00072
<p>
An important step for assistive systems and robot companions operating in human
environments is to learn the compositionality of human activities, i.e., recognize both
activities and their constituent actions. Most existing approaches treat action and
activity recognition as separate tasks, where actions must be inferred before the activity
labels; they are therefore highly sensitive to the correct temporal segmentation of the
activity sequences. In this paper, we present a novel learning approach that jointly learns
human activities on two levels of semantic and temporal complexity: (1) transitive actions,
such as reaching for and opening a cereal box, and (2) high-level activities, such as having
breakfast. Our model consists of a hierarchy of Growing When Required (GWR) networks which process and learn
inherent spatiotemporal dependencies of multiple visual cues extracted from the human
body skeletal representation and the interaction with objects. The neural architecture
learns and semantically segments input RGB-D sequences of high-level activities into
their constituent actions, without supervision. We investigate the performance of our
architecture with a set of experiments on a publicly available benchmark dataset. The
experimental results show that our approach outperforms the state of the art in the
classification of high-level activities. Additionally, we introduce a novel
top-down modulation mechanism to the architecture, which uses the action and activity
labels as constraints during the learning phase. In our experiments, we show how this
mechanism can be used to control the network's neural growth without decreasing the
overall performance.
</p>
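<p>
For readers unfamiliar with GWR, below is a minimal Python sketch of a single training step of a Growing When Required network, following Marsland et al. (2002), the algorithm the architecture builds on. The hierarchical stacking, temporal context, and exact parameters used in the paper are not reproduced here, and the allow_growth flag is only a hypothetical stand-in for the label-driven top-down modulation described in the abstract.
</p>
<pre>
import numpy as np

class GWR:
    """Minimal Growing When Required (GWR) network (Marsland et al., 2002).

    A sketch only: the paper stacks several such networks in a hierarchy
    and adds temporal context, which is not reproduced here.
    """

    def __init__(self, dim, act_thresh=0.9, hab_thresh=0.1,
                 eps_b=0.1, eps_n=0.01, tau=0.3, max_age=50):
        # Start with two random nodes; in practice they are usually
        # initialized from two training samples.
        self.w = [np.random.rand(dim), np.random.rand(dim)]  # node weights
        self.h = [1.0, 1.0]                                  # habituation (1 = fresh)
        self.edges = {(0, 1): 0}                             # (i, j) -> age
        self.act_thresh, self.hab_thresh = act_thresh, hab_thresh
        self.eps_b, self.eps_n = eps_b, eps_n
        self.tau, self.max_age = tau, max_age

    def step(self, x, allow_growth=True):
        # allow_growth is a hypothetical hook standing in for the paper's
        # label-driven top-down modulation, which constrains neural growth.
        # 1. Find the best and second-best matching units.
        dists = [np.linalg.norm(x - w) for w in self.w]
        b, s = np.argsort(dists)[:2]
        self.edges[tuple(sorted((b, s)))] = 0   # connect/refresh the b-s edge
        activation = np.exp(-dists[b])
        if (allow_growth and activation < self.act_thresh
                and self.h[b] < self.hab_thresh):
            # 2a. A well-trained winner matches poorly: insert a new node
            # halfway between the winner and the input.
            r = len(self.w)
            self.w.append((self.w[b] + x) / 2.0)
            self.h.append(1.0)
            del self.edges[tuple(sorted((b, s)))]
            self.edges[tuple(sorted((r, b)))] = 0
            self.edges[tuple(sorted((r, s)))] = 0
        else:
            # 2b. Otherwise move the winner and its neighbors toward x,
            # scaled by how habituated each node already is.
            self.w[b] = self.w[b] + self.eps_b * self.h[b] * (x - self.w[b])
            for (i, j) in list(self.edges):
                if b in (i, j):
                    n = j if i == b else i
                    self.w[n] = self.w[n] + self.eps_n * self.h[n] * (x - self.w[n])
        # 3. Habituate the winner (simplified exponential decay toward 0).
        self.h[b] *= (1.0 - self.tau)
        # 4. Age the winner's other edges and prune stale ones.
        for e in list(self.edges):
            if b in e and e != tuple(sorted((b, s))):
                self.edges[e] += 1
                if self.edges[e] > self.max_age:
                    del self.edges[e]
</pre>
<p>
Fed frame-wise feature vectors (e.g., skeleton joint positions), such a network grows prototype nodes only where the input distribution is poorly represented, which is what lets a hierarchy of these networks allocate neurons to recurring action patterns.
</p>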
@Article{MPW19,
  author  = {Mici, Luiza and Parisi, German I. and Wermter, Stefan},
  title   = {Compositional Learning of Human Activities With a Self-Organizing Neural Architecture},
  journal = {Frontiers in Robotics and AI},
  volume  = {6},
  number  = {72},
  year    = {2019},
  month   = aug,
  doi     = {10.3389/frobt.2019.00072},
}