
Phrase-level Self-Attention Networks for Universal Sentence Encoding

Wei Wu, Houfeng Wang, Tianyu Liu, Shuming Ma


Abstract
Universal sentence encoding is a hot topic in recent NLP research. The attention mechanism has been an integral part of many sentence encoding models, allowing them to capture context dependencies regardless of the distance between elements in the sequence. Fully attention-based models have recently attracted enormous interest due to their highly parallelizable computation and significantly shorter training time. However, the memory consumption of these models grows quadratically with sentence length, and syntactic information is neglected. To this end, we propose Phrase-level Self-Attention Networks (PSAN) that perform self-attention across words inside a phrase to capture context dependencies at the phrase level, and use a gated memory updating mechanism to refine each word’s representation hierarchically with longer-term context dependencies captured in a larger phrase. As a result, memory consumption is reduced because self-attention is performed at the phrase level instead of the sentence level. At the same time, syntactic information can be easily integrated into the model. Experimental results show that PSAN achieves state-of-the-art performance across a range of NLP tasks including binary and multi-class classification, natural language inference and sentence similarity.
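
The abstract describes two core operations: self-attention restricted to the words of each phrase, and a gated memory update that merges each word's state with its phrase-level context, applied repeatedly over larger phrases. The following is a minimal sketch of that idea, not the authors' implementation: phrase spans are assumed to come from an external syntactic parser, and every function and variable name below is our own illustrative choice.

# Minimal sketch (hypothetical, not the paper's code) of phrase-level
# self-attention with a gated memory update, applied level by level.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def phrase_self_attention(H, spans):
    """Self-attention restricted to the words inside each phrase span.

    H:     (n, d) word representations of one sentence
    spans: list of (start, end) phrase boundaries (end exclusive)
    Returns phrase-level context vectors C of shape (n, d).
    """
    n, d = H.shape
    C = np.zeros_like(H)
    for s, e in spans:
        P = H[s:e]                             # words of this phrase only
        scores = P @ P.T / np.sqrt(d)          # attention within the phrase
        C[s:e] = softmax(scores, axis=-1) @ P  # phrase-level context
    return C

def gated_update(H, C, Wg, Ug, bg):
    """Gated memory update: mix each word's old state with its phrase context."""
    g = 1.0 / (1.0 + np.exp(-(H @ Wg + C @ Ug + bg)))  # update gate in (0, 1)
    return g * H + (1.0 - g) * C

# Toy usage: refine word states hierarchically, from smaller to larger phrases.
rng = np.random.default_rng(0)
n, d = 6, 8
H = rng.standard_normal((n, d))
Wg, Ug, bg = rng.standard_normal((d, d)), rng.standard_normal((d, d)), np.zeros(d)
for spans in [[(0, 3), (3, 6)], [(0, 6)]]:  # assumed parser-derived phrase levels
    C = phrase_self_attention(H, spans)
    H = gated_update(H, C, Wg, Ug, bg)
print(H.shape)  # (6, 8): refined word representations

Because attention is computed only within each phrase, the largest attention matrix scales with the longest phrase rather than the full sentence, which is where the memory saving over sentence-level self-attention comes from.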
Anthology ID:
D18-1408
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3729–3738
URL:
https://aclanthology.org/D18-1408
DOI:
10.18653/v1/D18-1408
Cite (ACL):
Wei Wu, Houfeng Wang, Tianyu Liu, and Shuming Ma. 2018. Phrase-level Self-Attention Networks for Universal Sentence Encoding. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3729–3738, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Phrase-level Self-Attention Networks for Universal Sentence Encoding (Wu et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1408.pdf
Video:
https://aclanthology.org/D18-1408.mp4
Data
SNLI, SST, SST-2, SST-5