
Idiom-Aware Compositional Distributed Semantics

Pengfei Liu, Kaiyu Qian, Xipeng Qiu, Xuanjing Huang


Abstract
Idioms are peculiar linguistic constructions that pose great challenges for representing the semantics of language, especially for the currently prevailing end-to-end neural models, which assume that the semantics of a phrase or sentence can be composed literally from its constituent words. In this paper, we propose an idiom-aware distributed semantic model that builds sentence representations on the basis of understanding the idioms they contain. Our models, grounded in the literal-first psycholinguistic hypothesis, adaptively learn whether to compose the semantics of a phrase literally or to treat it idiomatically. To better evaluate our models, we also construct an idiom-enriched sentiment classification dataset of considerable scale that captures the rich peculiarities of idioms. Qualitative and quantitative experimental analyses demonstrate the efficacy of our models.
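
The adaptive "literally or idiomatically" composition described in the abstract can be pictured as a gated blend between a literal composition of the span's word vectors and a learned per-idiom embedding. The sketch below is only an illustration of that idea, not the authors' implementation; it assumes PyTorch, and all names (IdiomAwareComposer, idiom_table, the mean-based literal composition) are hypothetical.

```python
# Minimal sketch (assumption, not the paper's model): a gate decides how much
# of a phrase's meaning comes from literal composition of its words versus a
# learned idiom embedding, echoing the literal-first hypothesis.
import torch
import torch.nn as nn


class IdiomAwareComposer(nn.Module):
    def __init__(self, embed_dim: int, num_idioms: int):
        super().__init__()
        # One learned vector per known idiom (the non-compositional reading).
        self.idiom_table = nn.Embedding(num_idioms, embed_dim)
        # Gate network: inspects both readings and outputs a mixing weight in (0, 1).
        self.gate = nn.Sequential(
            nn.Linear(2 * embed_dim, embed_dim),
            nn.Tanh(),
            nn.Linear(embed_dim, 1),
            nn.Sigmoid(),
        )

    def forward(self, word_vectors: torch.Tensor, idiom_id: torch.Tensor) -> torch.Tensor:
        # word_vectors: (batch, span_len, embed_dim) embeddings of the words
        # inside a candidate idiom span; idiom_id: (batch,) index of the idiom.
        literal = word_vectors.mean(dim=1)       # literal-first composition
        idiomatic = self.idiom_table(idiom_id)   # learned idiomatic reading
        g = self.gate(torch.cat([literal, idiomatic], dim=-1))
        # g near 0 keeps the literal reading; g near 1 switches to the idiom.
        return (1 - g) * literal + g * idiomatic


if __name__ == "__main__":
    composer = IdiomAwareComposer(embed_dim=64, num_idioms=1000)
    words = torch.randn(2, 4, 64)       # two spans of four words each
    ids = torch.tensor([17, 523])       # indices of the detected idioms
    print(composer(words, ids).shape)   # torch.Size([2, 64])
```

In a full sentence model, the blended span vector would replace the individual word representations of the idiom span before sentence-level composition; that wiring is likewise an assumption here.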
Anthology ID:
D17-1124
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1204–1213
URL:
https://aclanthology.org/D17-1124
DOI:
10.18653/v1/D17-1124
Cite (ACL):
Pengfei Liu, Kaiyu Qian, Xipeng Qiu, and Xuanjing Huang. 2017. Idiom-Aware Compositional Distributed Semantics. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1204–1213, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Idiom-Aware Compositional Distributed Semantics (Liu et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1124.pdf
Data
SST, SST-2