A recurrent neural model with attention for the recognition of Chinese implicit discourse relations

  • We introduce an attention-based Bi-LSTM for Chinese implicit discourse relations and demonstrate that modeling argument pairs as a joint sequence can outperform word order-agnostic approaches. Our model benefits from a partial sampling scheme and is conceptually simple, yet achieves state-of-the-art performance on the Chinese Discourse Treebank. We also visualize its attention activity to illustrate the model’s ability to selectively focus on the relevant parts of an input sequence.
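The attention mechanism described above pools the Bi-LSTM's hidden states into a single representation of the argument pair, with learned weights that highlight the most relevant tokens. A minimal sketch of such additive-style attention pooling is given below; the scoring vector `w` and function names are hypothetical simplifications, not the paper's exact parameterization.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, w):
    """Attention-weighted pooling over recurrent hidden states.

    hidden_states: (T, d) array, one vector per token of the
                   jointly encoded argument-pair sequence.
    w:             (d,) learned scoring vector (hypothetical; the
                   paper may use a richer scoring function).
    Returns the (d,) context vector and the (T,) attention weights,
    which can be plotted to visualize where the model focuses.
    """
    scores = hidden_states @ w        # one relevance score per time step
    alpha = softmax(scores)           # normalize scores to a distribution
    context = alpha @ hidden_states   # weighted sum of hidden states
    return context, alpha

# Toy example: 4 time steps, hidden size 3
H = np.array([[0.1, 0.2, 0.3],
              [0.9, 0.8, 0.7],
              [0.0, 0.1, 0.0],
              [0.5, 0.5, 0.5]])
w = np.array([1.0, 0.0, 0.0])
context, alpha = attention_pool(H, w)
```

The weight vector `alpha` sums to one, so the context vector is a convex combination of the hidden states; inspecting `alpha` per example is what enables the attention visualizations mentioned in the abstract.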

Metadata
Author:Samuel Rönnqvist, Niko Schenk, Christian Chiarcos
URN:urn:nbn:de:bvb:384-opus4-1041110
Frontdoor URL:https://opus.bibliothek.uni-augsburg.de/opus4/104111
ISBN:978-1-945626-76-0
Parent Title (English):Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), July 30 - August 4, 2017, Vancouver, Canada
Publisher:Association for Computational Linguistics
Place of publication:Stroudsburg, PA
Editor:Chris Callison-Burch, Regina Barzilay, Min-Yen Kan, Wei Lu, Sameer Singh, Margaret Mitchell
Type:Conference Proceeding
Language:English
Year of first Publication:2017
Publishing Institution:Universität Augsburg
Release Date:2023/05/10
First Page:256
Last Page:262
DOI:https://doi.org/10.18653/v1/P17-2040
Institutes:Philologisch-Historische Fakultät
Philologisch-Historische Fakultät / Angewandte Computerlinguistik
Philologisch-Historische Fakultät / Angewandte Computerlinguistik / Lehrstuhl für Angewandte Computerlinguistik (ACoLi)
Dewey Decimal Classification:6 Technology, medicine, applied sciences / 61 Medicine and health / 610 Medicine and health
Licence (German):CC-BY 4.0: Creative Commons: Attribution (with Print on Demand)