
Tag-Enhanced Tree-Structured Neural Networks for Implicit Discourse Relation Classification

Yizhong Wang, Sujian Li, Jingfeng Yang, Xu Sun, Houfeng Wang


Abstract
Identifying implicit discourse relations between text spans is a challenging task because it requires understanding the meaning of the text. To tackle this task, recent studies have tried several deep learning methods, but few of them exploit syntactic information. In this work, we explore the idea of incorporating syntactic parse trees into neural networks. Specifically, we employ the Tree-LSTM and Tree-GRU models, which are based on the tree structure, to encode the arguments of a relation, and we further leverage constituent tags to control the semantic composition process in these tree-structured neural networks. Experimental results show that our method achieves state-of-the-art performance on the PDTB corpus.
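
To make the composition idea concrete, the following is a minimal PyTorch sketch of a tag-enhanced binary Tree-LSTM cell in the spirit of the abstract: a constituent-tag embedding is fed into the gate computation so that the syntactic tag of a node influences how its children's representations are combined. The class name, the concatenation-based gating, and all dimensions are illustrative assumptions, not the paper's exact formulation.

import torch
import torch.nn as nn

class TagTreeLSTMCell(nn.Module):
    """Binary Tree-LSTM composition whose gates also see a constituent-tag
    embedding (hypothetical formulation; the paper's exact gating may differ)."""

    def __init__(self, hidden_size: int, num_tags: int, tag_dim: int):
        super().__init__()
        self.tag_embed = nn.Embedding(num_tags, tag_dim)
        in_dim = 2 * hidden_size + tag_dim  # left child + right child + tag
        # input, output, and cell-update gates, plus one forget gate per child
        self.iou = nn.Linear(in_dim, 3 * hidden_size)
        self.f_left = nn.Linear(in_dim, hidden_size)
        self.f_right = nn.Linear(in_dim, hidden_size)

    def forward(self, left, right, tag_id):
        (h_l, c_l), (h_r, c_r) = left, right
        t = self.tag_embed(tag_id)
        x = torch.cat([h_l, h_r, t], dim=-1)
        i, o, u = torch.chunk(self.iou(x), 3, dim=-1)
        i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
        f_l = torch.sigmoid(self.f_left(x))
        f_r = torch.sigmoid(self.f_right(x))
        c = i * u + f_l * c_l + f_r * c_r
        h = o * torch.tanh(c)
        return h, c

# Toy usage: compose two leaf states under a constituent node (tag id 0 is arbitrary).
if __name__ == "__main__":
    cell = TagTreeLSTMCell(hidden_size=8, num_tags=4, tag_dim=4)
    leaf = (torch.zeros(1, 8), torch.zeros(1, 8))
    h, c = cell(leaf, leaf, torch.tensor([0]))
    print(h.shape)  # torch.Size([1, 8])

Applying such a cell bottom-up over each argument's parse tree yields a root representation per argument, which can then be paired and passed to a relation classifier; the Tree-GRU variant mentioned in the abstract would replace the cell state and gates with GRU-style updates.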
Anthology ID:
I17-1050
Volume:
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
November
Year:
2017
Address:
Taipei, Taiwan
Editors:
Greg Kondrak, Taro Watanabe
Venue:
IJCNLP
Publisher:
Asian Federation of Natural Language Processing
Pages:
496–505
URL:
https://aclanthology.org/I17-1050
Cite (ACL):
Yizhong Wang, Sujian Li, Jingfeng Yang, Xu Sun, and Houfeng Wang. 2017. Tag-Enhanced Tree-Structured Neural Networks for Implicit Discourse Relation Classification. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 496–505, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
Tag-Enhanced Tree-Structured Neural Networks for Implicit Discourse Relation Classification (Wang et al., IJCNLP 2017)
PDF:
https://aclanthology.org/I17-1050.pdf