Question Answering with Hierarchical Attention Networks
Proceedings of the International Joint Conference on Neural Networks (IJCNN 2019),
doi: 10.1109/IJCNN.2019.8852403
- Jul 2019
We investigate hierarchical attention networks for the task of question answering. For this purpose, we propose two different approaches: in the first, a document vector representation is built hierarchically from word to sentence level and then used to infer the correct answer; in the second, pointer sum attention is used to infer an answer directly from the attention values of the word and sentence representations. We evaluate our approach on the Children's Book Test, a cloze-style question answering dataset, and analyze the generated attention distributions. Our results show that, although a hierarchical approach offers little improvement over a shallow baseline, combining word and sentence attention with pointer sum attention yields a large performance boost.
@InProceedings{AHNW19,
  author    = {Alpay, Tayfun and Heinrich, Stefan and Nelskamp, Michael and Wermter, Stefan},
  title     = {Question Answering with Hierarchical Attention Networks},
  booktitle = {Proceedings of the International Joint Conference on Neural Networks (IJCNN 2019)},
  year      = {2019},
  month     = jul,
  publisher = {IEEE},
  doi       = {10.1109/IJCNN.2019.8852403},
}