
Higher-Order Syntactic Attention Network for Longer Sentence Compression

Hidetaka Kamigaito, Katsuhiko Hayashi, Tsutomu Hirao, Masaaki Nagata


Abstract
A sentence compression method using LSTMs can generate fluent compressed sentences. However, its performance degrades significantly when compressing longer sentences, since it does not explicitly handle syntactic features. To solve this problem, we propose a higher-order syntactic attention network (HiSAN) that can handle higher-order dependency features as an attention distribution over LSTM hidden states. Furthermore, to avoid the influence of incorrect parse results, we trained HiSAN by jointly maximizing the probability of a correct output together with the attention distribution. Experimental results on the Google sentence compression dataset showed that our method achieved the best performance in F1 as well as ROUGE-1, ROUGE-2, and ROUGE-L scores: 83.2, 82.9, 75.8, and 82.7, respectively. In human evaluation, our method also outperformed baseline methods in both readability and informativeness.
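The core idea of treating higher-order dependency features as an attention distribution can be illustrated with a minimal sketch. This is not the authors' implementation: the bilinear parent scorer, the equal mixing weights, and all tensor shapes below are illustrative assumptions; only the chaining of parent distributions to obtain grandparent (second-order) attention mirrors the idea described in the abstract.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy setup: n words, d-dimensional BiLSTM hidden states (random stand-ins).
rng = np.random.default_rng(0)
n, d = 5, 8
H = rng.normal(size=(n, d))          # hidden states h_1 .. h_n

# First-order attention: P[i, j] ~ p(head of word i is word j),
# scored here with a simple bilinear form (an assumption, not the paper's scorer).
W = rng.normal(size=(d, d))
P = softmax(H @ W @ H.T, axis=1)     # each row is a parent distribution

# Higher-order (grandparent) attention by chaining parent distributions:
# p(grandparent of i is j) = sum_k P[i, k] * P[k, j].
P2 = P @ P

# Syntactic context for each word: an expectation of hidden states under the
# parent and grandparent attention distributions (mixing weights are illustrative).
context = 0.5 * (P @ H) + 0.5 * (P2 @ H)

assert np.allclose(P.sum(axis=1), 1.0)   # rows of P are valid distributions
assert np.allclose(P2.sum(axis=1), 1.0)  # chaining preserves normalization
print(context.shape)  # (5, 8)
```

Because `P2` stays a proper probability distribution per word, the model can attend to grandparent positions softly rather than committing to a single (possibly wrong) parse, which is the motivation for training the attention jointly with the output probability.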
Anthology ID:
N18-1155
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1716–1726
URL:
https://aclanthology.org/N18-1155
DOI:
10.18653/v1/N18-1155
Cite (ACL):
Hidetaka Kamigaito, Katsuhiko Hayashi, Tsutomu Hirao, and Masaaki Nagata. 2018. Higher-Order Syntactic Attention Network for Longer Sentence Compression. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 1716–1726, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Higher-Order Syntactic Attention Network for Longer Sentence Compression (Kamigaito et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-1155.pdf
Note:
 N18-1155.Notes.pdf
Data
Google Sentence Compression