
Term Definitions Help Hypernymy Detection

Wenpeng Yin, Dan Roth


Abstract
Existing methods of hypernymy detection mainly rely on statistics over a big corpus, either mining some co-occurring patterns like “animals such as cats” or embedding words of interest into context-aware vectors. These approaches are therefore limited by the availability of a large enough corpus that can cover all terms of interest and provide sufficient contextual information to represent their meaning. In this work, we propose a new paradigm, HyperDef, for hypernymy detection – expressing word meaning by encoding word definitions, along with context-driven representation. This has two main benefits: (i) Definitional sentences express (sense-specific) corpus-independent meanings of words, hence definition-driven approaches enable strong generalization – once trained, the model is expected to work well in open-domain testbeds; (ii) Global context from a large corpus and definitions provide complementary information for words. Consequently, our model, HyperDef, once trained on task-agnostic data, achieves state-of-the-art results in multiple benchmarks.
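The abstract describes combining corpus-based (global-context) word representations with representations derived from dictionary definitions before classifying a candidate (hyponym, hypernym) pair. The sketch below illustrates that general idea only; it is not the authors' HyperDef architecture, and all names in it (encode_definition, pair_features, the toy vectors and glosses) are hypothetical stand-ins.

```python
# Minimal sketch of the "definition + context" idea from the abstract,
# NOT the paper's actual HyperDef model. All names here are hypothetical.
import numpy as np


def encode_definition(definition_tokens, token_vectors):
    """Represent a gloss by averaging its token vectors
    (a crude stand-in for a learned definition encoder)."""
    vecs = [token_vectors[t] for t in definition_tokens if t in token_vectors]
    if not vecs:
        dim = next(iter(token_vectors.values())).shape
        return np.zeros(dim)
    return np.mean(vecs, axis=0)


def pair_features(hypo, hyper, definitions, word_vectors):
    """Concatenate the corpus-based embedding and the definition embedding
    of each term; a binary classifier trained on such features would then
    decide whether `hyper` is a hypernym of `hypo`."""
    parts = []
    for w in (hypo, hyper):
        parts.append(word_vectors[w])                                   # global-context signal
        parts.append(encode_definition(definitions.get(w, []), word_vectors))  # definition signal
    return np.concatenate(parts)


# Toy usage with made-up 4-dimensional vectors and glosses.
words = ["cat", "animal", "small", "feline", "living", "organism"]
word_vectors = {w: np.random.rand(4) for w in words}
definitions = {"cat": ["small", "feline", "animal"],
               "animal": ["living", "organism"]}

features = pair_features("cat", "animal", definitions, word_vectors)
print(features.shape)  # (16,) -> input to a hypernymy classifier
```

In this toy setup the two signals are simply concatenated; the point is only that definition-derived vectors are corpus-independent and can complement context-derived ones, which is the intuition the abstract states.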
Anthology ID:
S18-2025
Volume:
Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Malvina Nissim, Jonathan Berant, Alessandro Lenci
Venue:
*SEM
SIGs:
SIGLEX | SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
203–213
URL:
https://aclanthology.org/S18-2025
DOI:
10.18653/v1/S18-2025
Cite (ACL):
Wenpeng Yin and Dan Roth. 2018. Term Definitions Help Hypernymy Detection. In Proceedings of the Seventh Joint Conference on Lexical and Computational Semantics, pages 203–213, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Term Definitions Help Hypernymy Detection (Yin & Roth, *SEM 2018)
PDF:
https://aclanthology.org/S18-2025.pdf
Data:
DBpedia | EVALution | YAGO