Conversational Analysis using Utterance-level Attention-based Bidirectional Recurrent Neural Networks
INTERSPEECH 2018, pages 996--1000, Sep 2018.
doi: 10.21437/Interspeech.2018-2527
Recent approaches for dialogue act recognition have shown that
context from preceding utterances is important for classifying the
subsequent one, and that performance improves considerably when
this context is taken into account. We propose an utterance-level
attention-based bidirectional recurrent neural network
(Utt-Att-BiRNN) model to analyze the importance of preceding
utterances for classifying the current one. In our setup, the
BiRNN receives the current utterance together with a set of
preceding utterances as input. Our model outperforms previous
models that use only preceding utterances as context on the corpus
used. A further contribution of our research is a mechanism that
discovers how much information each utterance contributes to
classifying the subsequent one, and shows that context-based
learning not only improves performance but also yields higher
confidence in the recognition of dialogue acts. We represent the
utterances with character- and word-level features. Results are
presented for the character and word feature representations
separately and for an ensemble model of both. We found that when
classifying short utterances, the closest preceding utterances
contribute to a higher degree.
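To make the architecture described in the abstract concrete, here is a minimal PyTorch sketch of an utterance-level attention BiRNN classifier. It is not the authors' implementation: the class name, the use of precomputed utterance vectors (the paper derives them from character- and word-level features), the dimensions, and the 42 dialogue-act classes (as in Switchboard-style corpora) are all illustrative assumptions. The attention weights over the utterance window are what would let one inspect how much each preceding utterance contributes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class UttAttBiRNN(nn.Module):
    """Sketch of an utterance-level attention-based BiRNN.

    Input is a window of utterance embeddings (preceding utterances
    plus the current one); attention weights over the window indicate
    how much each utterance contributes to the dialogue-act prediction
    for the current utterance.
    """

    def __init__(self, utt_dim, hidden_dim, num_acts):
        super().__init__()
        self.birnn = nn.GRU(utt_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.att = nn.Linear(2 * hidden_dim, 1)   # one score per utterance
        self.out = nn.Linear(2 * hidden_dim, num_acts)

    def forward(self, utts):
        # utts: (batch, window, utt_dim) -- precomputed utterance vectors
        h, _ = self.birnn(utts)                   # (batch, window, 2*hidden)
        alpha = F.softmax(self.att(h), dim=1)     # weights over the window
        context = (alpha * h).sum(dim=1)          # attention-weighted sum
        return self.out(context), alpha.squeeze(-1)

# Example: batch of 4 windows, each with 5 utterances of dimension 128
model = UttAttBiRNN(utt_dim=128, hidden_dim=64, num_acts=42)
logits, weights = model(torch.randn(4, 5, 128))
print(logits.shape, weights.shape)  # (4, 42) and (4, 5)
```

The returned `weights` tensor is the hook for the analysis the paper describes: averaging it over a test set would show, for instance, whether the closest preceding utterances receive higher weight when the current utterance is short.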
@InProceedings{BMWW18,
  author    = {Bothe, Chandrakant and Magg, Sven and Weber, Cornelius and Wermter, Stefan},
  title     = {Conversational Analysis using Utterance-level Attention-based Bidirectional Recurrent Neural Networks},
  booktitle = {INTERSPEECH 2018},
  pages     = {996--1000},
  year      = {2018},
  month     = {Sep},
  publisher = {ISCA},
  doi       = {10.21437/Interspeech.2018-2527},
}