Few-shot Incremental Event Detection

Published: 08 February 2024

Abstract

Event detection enables the rapid identification of events in text and provides powerful support for downstream natural language processing tasks. Most existing methods can only detect a fixed set of predefined event classes; extending them to a new class without losing the ability to detect old classes requires costly retraining of the model from scratch. Incremental learning can effectively address this problem, but it requires abundant data for the new classes, and in practice the scarcity of high-quality labeled data for new event classes makes it difficult to collect enough for model training. To address these issues, we define a new task, few-shot incremental event detection, which focuses on learning to detect a new event class from limited data while retaining, to the extent possible, the ability to detect old classes. We construct a benchmark dataset, IFSED, for this task based on FewEvent and propose two benchmarks, IFSED-K and IFSED-KP. Experimental results show that our approach achieves a higher F1-score than baseline methods and is more stable.
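To make the task setup concrete, here is a minimal, illustrative sketch (not the paper's method) of how a prototype-based classifier can absorb a new event class from only a few labeled examples without retraining on the old classes. The embeddings, class labels, and Gaussian toy data below are all assumptions for illustration; a real system would embed trigger words with an encoder such as BERT.

```python
import numpy as np

def prototypes(support, labels):
    """Mean embedding per class: one prototype per event class."""
    return {c: support[labels == c].mean(axis=0) for c in np.unique(labels)}

def classify(query, protos):
    """Assign each query embedding to its nearest prototype (Euclidean)."""
    classes = list(protos)
    dists = np.stack([np.linalg.norm(query - protos[c], axis=1) for c in classes])
    return [classes[i] for i in dists.argmin(axis=0)]

# Base session: two "old" event classes learned from ample data.
rng = np.random.default_rng(0)
base_x = np.concatenate([rng.normal(0, 0.1, (20, 4)), rng.normal(3, 0.1, (20, 4))])
base_y = np.array([0] * 20 + [1] * 20)
protos = prototypes(base_x, base_y)

# Incremental session: a new event class arrives with only K=5 shots.
new_x = rng.normal(-3, 0.1, (5, 4))
protos[2] = new_x.mean(axis=0)  # add its prototype; old prototypes are untouched

# Old and new classes are detected with the same nearest-prototype rule.
preds = classify(np.array([[0.0] * 4, [3.0] * 4, [-3.0] * 4]), protos)
print(preds)  # [0, 1, 2]
```

Because each class is summarized by a single stored prototype, adding a class neither overwrites old parameters nor requires revisiting old training data, which is the property the few-shot incremental setting evaluates.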



Published In

ACM Transactions on Asian and Low-Resource Language Information Processing, Volume 23, Issue 2
February 2024, 340 pages
EISSN: 2375-4702
DOI: 10.1145/3613556

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 08 February 2024
Online AM: 02 December 2023
Accepted: 20 November 2023
Revised: 11 September 2023
Received: 23 August 2022
Published in TALLIP Volume 23, Issue 2


Author Tags

  1. Event detection
  2. Few-shot
  3. Incremental learning

Qualifiers

  • Research-article

Funding Sources

  • R&D Program of Beijing Municipal Education Commission
  • National Natural Science Foundation of China
  • National Key Research and Development Program of China
