
A Local context focus learning model for joint multi-task using syntactic dependency relative distance

Published in Applied Intelligence

Abstract

Aspect-based sentiment analysis (ABSA) is a significant task in natural language processing. Although many ABSA systems have been proposed, the correlation between an aspect’s sentiment polarity and the semantic information of its local context has received little attention. Moreover, aspect term extraction and aspect sentiment classification are fundamental subtasks of ABSA, yet most existing systems fail to recognize the natural relation between them and therefore treat them as largely independent tasks. In this work, a local context focus method is proposed that represents semantic distance using syntactic dependency relative distance, calculated on an undirected dependency graph. We introduce this method into a multi-task learning framework with a multi-head attention mechanism for the joint task of aspect term extraction and aspect sentiment classification. Compared with existing models, the proposed local context focus method measures semantic distance more precisely and helps our model capture more effective local semantic information. In addition, a multi-head attention mechanism is employed to further enhance the local semantic representation. Furthermore, the proposed model makes full use of the aspect term information and aspect sentiment information provided by the two subtasks, thereby improving overall performance. Experimental results on four datasets show that the proposed model outperforms single-task and multi-task models on both the aspect term extraction and aspect sentiment classification tasks.
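The abstract only sketches how the syntactic dependency relative distance is obtained, so the minimal Python sketch below illustrates the general idea: each token’s distance to the aspect term is the shortest-path length over the undirected dependency graph, and tokens within a chosen threshold form the local context. This is not the authors’ implementation; the spaCy model, the simple surface matching of the aspect term, the threshold value, and the function name are all illustrative assumptions.

```python
# Illustrative sketch only: syntactic dependency relative distance computed as the
# shortest-path length between each token and the aspect term over an undirected
# dependency graph. Parser choice, aspect matching, and threshold are assumptions.
from collections import deque
import spacy

nlp = spacy.load("en_core_web_sm")  # assumed dependency parser

def syntactic_relative_distance(sentence: str, aspect: str, threshold: int = 3):
    doc = nlp(sentence)

    # Build an undirected adjacency list from the dependency arcs (head <-> child).
    adj = {tok.i: set() for tok in doc}
    for tok in doc:
        if tok.i != tok.head.i:  # skip the root's self-loop
            adj[tok.i].add(tok.head.i)
            adj[tok.head.i].add(tok.i)

    # Indices of tokens belonging to the aspect term (simple surface match).
    aspect_ids = [tok.i for tok in doc if tok.text.lower() in aspect.lower().split()]

    # Multi-source BFS from the aspect tokens gives each token its distance
    # to the nearest aspect token on the undirected dependency graph.
    dist = {i: 0 for i in aspect_ids}
    queue = deque(aspect_ids)
    while queue:
        cur = queue.popleft()
        for nxt in adj[cur]:
            if nxt not in dist:
                dist[nxt] = dist[cur] + 1
                queue.append(nxt)

    # Tokens whose distance is within the threshold form the local context.
    local_context = [tok.text for tok in doc if dist.get(tok.i, float("inf")) <= threshold]
    return dist, local_context

dists, context = syntactic_relative_distance(
    "The battery life is great but the screen is dim", "battery life"
)
print(context)  # tokens syntactically close to "battery life"
```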



Acknowledgements

This work is partially supported by grants from the Innovative Talents Project of Higher Education Institutions in Liaoning Province (No. WR2019005), the General Project of the Liaoning Provincial Social Science Planning Fund (No. L17BTQ005), and the 2021 Research Fund Project of Dalian University of Foreign Languages (No. 2021XJYB16 and No. 2021XJYB19).

Author information


Corresponding authors

Correspondence to Rui-Hua Qi or Zheng-Guang Li.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Qi, RH., Yang, MX., Jian, Y. et al. A Local context focus learning model for joint multi-task using syntactic dependency relative distance. Appl Intell 53, 4145–4161 (2023). https://doi.org/10.1007/s10489-022-03684-0
