
BERT Goes Brrr: A Venture Towards the Lesser Error in Classifying Medical Self-Reporters on Twitter

Alham Fikri Aji, Made Nindyatama Nityasya, Haryo Akbarianto Wibowo, Radityo Eko Prasojo, Tirana Fatyanosa


Abstract
This paper describes our team's submission to the Social Media Mining for Health (SMM4H) 2021 shared task. We participated in three subtasks: classifying adverse drug effects, COVID-19 self-reports, and COVID-19 symptoms. Our system is based on a BERT model pre-trained on domain-specific text. In addition, we perform data cleaning and augmentation, as well as hyperparameter optimization and model ensembling, to further boost BERT's performance. We achieved first rank in both the adverse drug effect and the COVID-19 self-report classification tasks.
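
The backbone the abstract describes, fine-tuning a domain-specific BERT checkpoint for binary tweet classification, can be sketched as follows. This is a minimal illustration, not the authors' released code: the Hugging Face checkpoint name, the hyperparameters, and the toy tweets are all assumptions chosen to keep the example self-contained.

    # Minimal sketch: fine-tune a domain-specific BERT for binary tweet
    # classification with Hugging Face Transformers. Checkpoint name,
    # learning rate, epoch count, and data are illustrative assumptions,
    # not values reported in the paper.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    MODEL_NAME = "digitalepidemiologylab/covid-twitter-bert-v2"  # assumed checkpoint

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSequenceClassification.from_pretrained(
        MODEL_NAME, num_labels=2
    )

    # Toy tweets standing in for the SMM4H data (label 1 = self-report).
    texts = ["I tested positive for covid yesterday", "stay safe everyone"]
    labels = torch.tensor([1, 0])

    enc = tokenizer(
        texts, padding=True, truncation=True, max_length=128, return_tensors="pt"
    )

    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
    model.train()
    for epoch in range(3):  # illustrative epoch count
        optimizer.zero_grad()
        out = model(**enc, labels=labels)  # cross-entropy loss computed internally
        out.loss.backward()
        optimizer.step()

    # Inference: argmax over the two class logits.
    model.eval()
    with torch.no_grad():
        predictions = model(**enc).logits.argmax(dim=-1)
    print(predictions)

Per the abstract, the full system layers data cleaning, augmentation, hyperparameter optimization, and ensembling of multiple fine-tuned models on top of this backbone; the sketch covers only the shared fine-tuning step.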
Anthology ID:
2021.smm4h-1.9
Volume:
Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task
Month:
June
Year:
2021
Address:
Mexico City, Mexico
Editors:
Arjun Magge, Ari Klein, Antonio Miranda-Escalada, Mohammed Ali Al-garadi, Ilseyar Alimova, Zulfat Miftahutdinov, Eulalia Farre-Maduell, Salvador Lima Lopez, Ivan Flores, Karen O'Connor, Davy Weissenbacher, Elena Tutubalina, Abeed Sarker, Juan M Banda, Martin Krallinger, Graciela Gonzalez-Hernandez
Venue:
SMM4H
Publisher:
Association for Computational Linguistics
Pages:
58–64
URL:
https://aclanthology.org/2021.smm4h-1.9
DOI:
10.18653/v1/2021.smm4h-1.9
Cite (ACL):
Alham Fikri Aji, Made Nindyatama Nityasya, Haryo Akbarianto Wibowo, Radityo Eko Prasojo, and Tirana Fatyanosa. 2021. BERT Goes Brrr: A Venture Towards the Lesser Error in Classifying Medical Self-Reporters on Twitter. In Proceedings of the Sixth Social Media Mining for Health (#SMM4H) Workshop and Shared Task, pages 58–64, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
BERT Goes Brrr: A Venture Towards the Lesser Error in Classifying Medical Self-Reporters on Twitter (Aji et al., SMM4H 2021)
PDF:
https://aclanthology.org/2021.smm4h-1.9.pdf
Data
SMM4H