Cross-Domain Aspect-Based Sentiment Classification with a Pre-Training and Fine-Tuning Strategy for Low-Resource Domains

Published: 15 April 2024

Abstract

Aspect-based sentiment classification (ABSC) is a crucial sub-task of fine-grained sentiment analysis that aims to predict the sentiment polarity of given aspects in a sentence as positive, negative, or neutral. Most existing ABSC methods are based on supervised learning. However, they rely heavily on fine-grained labeled training data, which can be scarce in low-resource domains, limiting their effectiveness. To overcome this challenge, we propose a low-resource cross-domain aspect-based sentiment classification (CDABSC) approach based on a pre-training and fine-tuning strategy. The approach applies this strategy to an advanced deep learning method designed for ABSC, namely the attention-based encoding graph convolutional network (AEGCN) model. Specifically, a high-resource domain is selected as the source domain, the AEGCN model is pre-trained on a large amount of fine-grained annotated data from that domain, and the best-performing model parameters are preserved. A low-resource domain is then used as the target domain, and the pre-trained parameters serve as the initialization of the target-domain model, which is fine-tuned on a small amount of annotated target-domain data so that its parameters adapt to the target domain, improving the accuracy of sentiment classification in the low-resource setting. Finally, experimental validation on two benchmark domains, restaurant and laptop, demonstrates that our approach significantly outperforms the baselines on CDABSC in terms of Micro-F1.
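To make the two-stage procedure concrete, below is a minimal sketch in PyTorch. It is not the authors' implementation: SimpleABSCModel is a placeholder standing in for the AEGCN model, and random tensors stand in for the restaurant (source) and laptop (target) datasets; only the pre-train, preserve-best-parameters, and fine-tune flow mirrors the description above.

```python
# Sketch of the pre-train / fine-tune transfer strategy described in the abstract.
# SimpleABSCModel is a stand-in for AEGCN; synthetic tensors replace the real datasets.
import copy
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class SimpleABSCModel(nn.Module):
    """Placeholder classifier standing in for AEGCN (3 classes: pos/neg/neu)."""
    def __init__(self, input_dim=300, hidden_dim=128, num_classes=3):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        return self.classifier(self.encoder(x))

def train(model, loader, epochs, lr):
    """Train and return the parameters of the best (lowest-loss) epoch."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    best_state, best_loss = None, float("inf")
    for _ in range(epochs):
        epoch_loss = 0.0
        for features, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(features), labels)
            loss.backward()
            optimizer.step()
            epoch_loss += loss.item()
        if epoch_loss < best_loss:  # preserve the optimal parameters
            best_loss = epoch_loss
            best_state = copy.deepcopy(model.state_dict())
    return best_state

# Synthetic stand-ins: large labeled source domain, small labeled target domain.
source = TensorDataset(torch.randn(2000, 300), torch.randint(0, 3, (2000,)))
target = TensorDataset(torch.randn(100, 300), torch.randint(0, 3, (100,)))

# 1) Pre-train on the high-resource source domain and keep the best parameters.
model = SimpleABSCModel()
source_params = train(model, DataLoader(source, batch_size=32, shuffle=True),
                      epochs=5, lr=1e-3)

# 2) Initialize the target-domain model with the pre-trained parameters,
#    then fine-tune on the small amount of labeled target-domain data.
target_model = SimpleABSCModel()
target_model.load_state_dict(source_params)
train(target_model, DataLoader(target, batch_size=16, shuffle=True),
      epochs=3, lr=1e-4)
```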


    Published In

    ACM Transactions on Asian and Low-Resource Language Information Processing, Volume 23, Issue 4
    April 2024
    221 pages
    EISSN: 2375-4702
    DOI: 10.1145/3613577

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 15 April 2024
    Online AM: 21 March 2024
    Accepted: 17 March 2024
    Revised: 03 January 2024
    Received: 12 April 2023
    Published in TALLIP Volume 23, Issue 4

    Author Tags

    1. Cross-domain aspect-based sentiment classification
    2. pre-training and fine-tuning
    3. domain adaptation
    4. transfer learning

    Qualifiers

    • Research-article
