
Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection

Fan Xu, Pinyun Fu, Qi Huang, Bowei Zou, AiTi Aw, Mingwen Wang


Abstract
Rumors spread rapidly through online social microblogs at a relatively low cost, causing substantial economic losses and negative consequences in our daily lives. Existing rumor detection models often neglect the underlying semantic coherence between text and image components in multimodal posts, as well as the challenges posed by incomplete modalities in single-modal posts, such as missing text or images. This paper presents CLKD-IMRD, a novel framework for Incomplete Modality Rumor Detection. CLKD-IMRD employs Contrastive Learning and Knowledge Distillation to capture the semantic consistency between text and image pairs, while also enhancing model generalization to incomplete modalities within individual posts. Extensive experimental results demonstrate that CLKD-IMRD outperforms state-of-the-art methods on two English and two Chinese benchmark datasets for rumor detection in social media.
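The abstract names two training ingredients, contrastive learning over text-image pairs and knowledge distillation for handling missing modalities, but this page does not give the paper's exact objectives. Below is a minimal PyTorch sketch of the two standard losses those terms usually refer to: a symmetric InfoNCE loss that pulls matched text/image embeddings together, and a temperature-scaled KL distillation loss from a multimodal teacher to a single-modality student. All function names, hyperparameters, and the teacher/student setup are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def info_nce_loss(text_emb, image_emb, temperature=0.07):
    """Symmetric InfoNCE over a batch of paired text/image embeddings.
    Matched pairs (row i of each tensor) are positives; all other
    in-batch pairs serve as negatives. (Illustrative, not from the paper.)"""
    text_emb = F.normalize(text_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)
    logits = text_emb @ image_emb.t() / temperature  # cosine similarities
    targets = torch.arange(text_emb.size(0), device=text_emb.device)
    # Average the text-to-image and image-to-text directions.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft targets from a multimodal teacher with hard labels, so a
    student that sees only one modality can mimic the teacher's predictions."""
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage with random features: batch of 8 posts, 256-dim embeddings,
# binary rumor/non-rumor classification.
text_emb = torch.randn(8, 256)
image_emb = torch.randn(8, 256)
teacher_logits = torch.randn(8, 2)   # teacher sees both modalities
student_logits = torch.randn(8, 2)   # student sees an incomplete post
labels = torch.randint(0, 2, (8,))
total = info_nce_loss(text_emb, image_emb) + distillation_loss(
    student_logits, teacher_logits, labels)

In this reading, the contrastive term enforces the text-image semantic consistency the abstract describes, while the distillation term transfers the full-modality teacher's knowledge to students trained on posts with a missing modality; how the paper actually combines or weights these objectives is not specified on this page.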
Anthology ID:
2023.findings-emnlp.900
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13492–13503
URL:
https://aclanthology.org/2023.findings-emnlp.900
DOI:
10.18653/v1/2023.findings-emnlp.900
Cite (ACL):
Fan Xu, Pinyun Fu, Qi Huang, Bowei Zou, AiTi Aw, and Mingwen Wang. 2023. Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 13492–13503, Singapore. Association for Computational Linguistics.
Cite (Informal):
Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection (Xu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.900.pdf