%0 Conference Proceedings
%T RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information
%A Vashishth, Shikhar
%A Joshi, Rishabh
%A Prayaga, Sai Suman
%A Bhattacharyya, Chiranjib
%A Talukdar, Partha
%Y Riloff, Ellen
%Y Chiang, David
%Y Hockenmaier, Julia
%Y Tsujii, Jun’ichi
%S Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
%D 2018
%8 oct nov
%I Association for Computational Linguistics
%C Brussels, Belgium
%F vashishth-etal-2018-reside
%X Distantly-supervised Relation Extraction (RE) methods train an extractor by automatically aligning relation instances in a Knowledge Base (KB) with unstructured text. In addition to relation instances, KBs often contain other relevant side information, such as aliases of relations (e.g., founded and co-founded are aliases for the relation founderOfCompany). RE models usually ignore such readily available side information. In this paper, we propose RESIDE, a distantly-supervised neural relation extraction method which utilizes additional side information from KBs for improved relation extraction. It uses entity type and relation alias information for imposing soft constraints while predicting relations. RESIDE employs Graph Convolution Networks (GCN) to encode syntactic information from text and improves performance even when limited side information is available. Through extensive experiments on benchmark datasets, we demonstrate RESIDE’s effectiveness. We have made RESIDE’s source code available to encourage reproducible research.
%R 10.18653/v1/D18-1157
%U https://aclanthology.org/D18-1157
%U https://doi.org/10.18653/v1/D18-1157
%P 1257-1266