A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations
Samuel Rönnqvist, Niko Schenk, Christian Chiarcos, A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations. In: Regina Barzilay, Min-Yen Kan (Eds.), Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2, 256–262, Association for Computational Linguistics, 2017.
http://dx.doi.org/10.18653/v1/P17-2040
Abstract:
We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model's ability to selectively focus on the relevant parts of an input sequence.
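The attention pooling the abstract describes — scoring each position of the jointly encoded argument pair and taking a weighted sum of hidden states — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, the scoring vector `w`, and the toy hidden states are assumptions for illustration only.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(H, w):
    """Attention over Bi-LSTM outputs (hypothetical sketch).

    H: (T, d) hidden states for the joint argument-pair sequence
    w: (d,)   learned scoring vector (assumed; trained in practice)
    Returns the attention distribution and the pooled context vector.
    """
    scores = H @ w           # (T,) unnormalized relevance per token
    alpha = softmax(scores)  # attention weights, sum to 1
    context = alpha @ H      # (d,) weighted sum of hidden states
    return alpha, context

# Toy example: T=4 time steps over both arguments, hidden size d=3.
H = np.array([[0.1, 0.2, 0.0],
              [0.9, 0.1, 0.3],
              [0.2, 0.8, 0.5],
              [0.0, 0.1, 0.1]])
w = np.array([1.0, 1.0, 0.0])
alpha, context = attention_pool(H, w)
```

The `alpha` vector is exactly what the paper visualizes: a distribution over the input sequence showing which tokens the classifier attends to before predicting the discourse relation.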
BibTeX entry:
@INPROCEEDINGS{inpRxScCh17a,
title = {A Recurrent Neural Model with Attention for the Recognition of Chinese Implicit Discourse Relations},
booktitle = {Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics},
author = {Rönnqvist, Samuel and Schenk, Niko and Chiarcos, Christian},
volume = {2},
editor = {Barzilay, Regina and Kan, Min-Yen},
publisher = {Association for Computational Linguistics},
pages = {256--262},
year = {2017},
keywords = {deep learning, neural networks, discourse parsing, chinese, attention},
ISSN = {0736-587X},
}
Belongs to TUCS Research Unit(s): Data Mining and Knowledge Management Laboratory
Publication Forum rating of this publication: level 2