
Exploring a Unified Sequence-To-Sequence Transformer for Medical Product Safety Monitoring in Social Media

Shivam Raval, Hooman Sedghamiz, Enrico Santus, Tuka Alhanai, Mohammad Ghassemi, Emmanuele Chersoni


Abstract
Adverse Events (AE) are harmful events resulting from the use of medical products. Although social media may be crucial for early AE detection, the sheer scale of this data makes it logistically intractable to analyze using human agents, with NLP representing the only low-cost and scalable alternative. In this paper, we frame AE Detection and Extraction as a sequence-to-sequence problem using the T5 model architecture and achieve strong performance improvements over the baselines on several English benchmarks (F1 = 0.71, 12.7% relative improvement for AE Detection; Strict F1 = 0.713, 12.4% relative improvement for AE Extraction). Motivated by the strong commonalities between AE tasks, the class imbalance in AE benchmarks, and the linguistic and structural variety typical of social media texts, we propose a new strategy for multi-task training that accounts, at the same time, for task and dataset characteristics. Our approach increases model robustness, leading to further performance gains. Finally, our framework shows some language transfer capabilities, obtaining higher performance than Multilingual BERT in zero-shot learning on French data.
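The abstract frames AE Detection as a text-to-text problem solved with T5. Below is a minimal illustrative sketch of that framing using an off-the-shelf T5 checkpoint via Hugging Face Transformers; the task prefix, label strings, and checkpoint name are assumptions for demonstration and are not the authors' released configuration (see the linked code repository for that).

```python
# Illustrative sketch (not the authors' released code): casting Adverse Event (AE)
# Detection as text-to-text generation with a pretrained T5 checkpoint.
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "t5-base"  # assumed checkpoint; the paper fine-tunes its own weights
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

post = "Started the new medication last week and the headaches have been unbearable."
# A task prefix turns binary AE Detection into generating a short label string
# (e.g. "adverse event" / "no adverse event"); the prefix wording is hypothetical.
inputs = tokenizer("ae detection: " + post, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

A fine-tuned model would be trained so that the generated string maps directly onto the detection label, and AE Extraction can reuse the same encoder-decoder by generating the extracted span instead of a label.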
Anthology ID:
2021.findings-emnlp.300
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3534–3546
URL:
https://aclanthology.org/2021.findings-emnlp.300
DOI:
10.18653/v1/2021.findings-emnlp.300
Cite (ACL):
Shivam Raval, Hooman Sedghamiz, Enrico Santus, Tuka Alhanai, Mohammad Ghassemi, and Emmanuele Chersoni. 2021. Exploring a Unified Sequence-To-Sequence Transformer for Medical Product Safety Monitoring in Social Media. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 3534–3546, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Exploring a Unified Sequence-To-Sequence Transformer for Medical Product Safety Monitoring in Social Media (Raval et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.300.pdf
Software:
2021.findings-emnlp.300.Software.zip
Video:
https://aclanthology.org/2021.findings-emnlp.300.mp4
Code:
shivamraval98/multitask-t5_ae