Abstract
Aspect-based sentiment analysis (ABSA) is an important task in natural language processing. Although many ABSA systems have been proposed, the correlation between an aspect's sentiment polarity and the semantic information of its local context has received little attention. Moreover, aspect term extraction and aspect sentiment classification are fundamental subtasks of ABSA, yet most existing systems fail to recognize the natural relation between them and treat them as largely independent tasks. In this work, a local context focus method is proposed that represents semantic distance using a syntactic dependency relative distance calculated over an undirected dependency graph. We introduce this method into a multi-task learning framework with a multi-head attention mechanism for the joint task of aspect term extraction and aspect sentiment classification. Compared with existing models, the proposed local context focus method measures semantic distance more precisely and helps the model capture more effective local semantic information. In addition, a multi-head attention mechanism is employed to further enhance the local semantic representation. Furthermore, the proposed model makes full use of the aspect term information and aspect sentiment information provided by the two subtasks, thereby improving overall performance. Experimental results on four datasets show that the proposed model outperforms both single-task and multi-task models on the aspect term extraction and aspect sentiment classification tasks.
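The key idea described above is to measure how far each word is from the aspect term along the dependency structure rather than in the raw token sequence. Below is a minimal sketch, not the authors' implementation, of computing such a syntactic dependency relative distance as the shortest-path length on an undirected dependency graph; the use of spaCy and networkx, the function name, and the token-index convention are illustrative assumptions.

```python
# Sketch: syntactic dependency relative distance via shortest paths
# on an undirected dependency graph (illustrative, not the paper's code).
import networkx as nx
import spacy

nlp = spacy.load("en_core_web_sm")

def syntactic_relative_distance(sentence: str, aspect_start: int, aspect_end: int):
    """For every token, return its shortest-path distance on the undirected
    dependency graph to the nearest aspect-term token.
    aspect_start/aspect_end are token indices (end exclusive)."""
    doc = nlp(sentence)
    graph = nx.Graph()
    for token in doc:
        graph.add_node(token.i)
        if token.head.i != token.i:          # skip the root's self-reference
            graph.add_edge(token.i, token.head.i)
    aspect_ids = range(aspect_start, aspect_end)
    distances = [
        min(nx.shortest_path_length(graph, token.i, a) for a in aspect_ids)
        for token in doc
    ]
    return [t.text for t in doc], distances

# Example: tokens syntactically close to the aspect "battery life" get small
# distances and would fall inside a local context window based on this measure.
tokens, dist = syntactic_relative_distance(
    "The battery life is great but the screen is dim", 1, 3)
print(list(zip(tokens, dist)))
```

In this sketch, a distance threshold over these values would then decide which tokens count as local context for the aspect, which is the role the local context focus mechanism plays in the model described in the abstract.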
References
Acknowledgements
This work is partially supported by grants from the Innovative Talents Project of Higher Education Institutions in Liaoning Province (No. WR2019005), the General Project of the Liaoning Provincial Social Science Planning Fund (No. L17BTQ005), and the 2021 Research Fund Project of Dalian University of Foreign Languages (Nos. 2021XJYB16 and 2021XJYB19).
About this article
Cite this article
Qi, RH., Yang, MX., Jian, Y. et al. A Local context focus learning model for joint multi-task using syntactic dependency relative distance. Appl Intell 53, 4145–4161 (2023). https://doi.org/10.1007/s10489-022-03684-0