
An efficient two-state GRU based on feature attention mechanism for sentiment analysis

  • 1203: Applications of Advanced Artificial Intelligence in Multimedia and Information Security
Multimedia Tools and Applications

Abstract

Sentiment analysis is one of the most challenging tasks in natural language processing (NLP), and its most widely used application is the sentiment classification of reviews. The purpose of sentiment classification is to determine the polarity of user opinions, attitudes, and emotions expressed in text as positive, negative, or neutral. Many advanced deep learning approaches have been proposed for sentiment analysis, and the recurrent neural network (RNN) is one of the most popular architectures employed for this task. In this paper, we propose a Two-State GRU (TS-GRU) based on a feature attention mechanism that identifies and categorizes sentiment polarity through sequential modeling and word-feature capture. The proposed approach integrates a pre-feature-attention GRU that models the complex connections between words via sentence-based sequential modeling, together with an attention layer that captures the keywords relevant to sentiment polarity. A decoder function is then added to the post-feature-attention GRU to extract the features weighted by the attention mechanism. The proposed approach is evaluated on three benchmark datasets: IMDB, MR, and SST2. Experimental results show that the TS-GRU model achieves sentiment analysis accuracies of 90.85%, 80.72%, and 86.51% on the IMDB, MR, and SST2 datasets, respectively.
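As a concrete illustration of the pipeline described in the abstract, the sketch below chains a sequence-modeling GRU, a feature-attention layer, and a second GRU followed by a decoder head in PyTorch. The additive attention form, layer sizes, and the class name TwoStateGRUWithAttention are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a two-stage GRU with feature attention (PyTorch).
# The attention form, dimensions, and decoder head are assumptions made
# for illustration; they are not taken from the paper.
import torch
import torch.nn as nn


class TwoStateGRUWithAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Stage 1: pre-feature-attention GRU models the word sequence.
        self.gru_pre = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Additive attention scores one weight per time step (assumed form).
        self.attn_score = nn.Linear(hidden_dim, 1)
        # Stage 2: post-feature-attention GRU re-reads the weighted features.
        self.gru_post = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        # Decoder head maps the final hidden state to sentiment polarity logits.
        self.decoder = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                            # token_ids: (B, T)
        x = self.embedding(token_ids)                        # (B, T, E)
        h_pre, _ = self.gru_pre(x)                           # (B, T, H)
        weights = torch.softmax(self.attn_score(h_pre), dim=1)  # (B, T, 1)
        weighted = weights * h_pre                           # emphasize key words
        _, h_last = self.gru_post(weighted)                  # (1, B, H)
        return self.decoder(h_last.squeeze(0))               # (B, num_classes)


# Example usage on random token ids (batch of 4 reviews, 50 tokens each).
model = TwoStateGRUWithAttention(vocab_size=20000)
logits = model(torch.randint(0, 20000, (4, 50)))
```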





Acknowledgements

The authors would like to thank the Ministry of Education Malaysia, Universiti Tun Hussein Onn Malaysia, and the Research Management Center (RMC).

Author information


Corresponding author

Correspondence to Muhammad Zulqarnain.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zulqarnain, M., Ghazali, R., Aamir, M. et al. An efficient two-state GRU based on feature attention mechanism for sentiment analysis. Multimed Tools Appl 83, 3085–3110 (2024). https://doi.org/10.1007/s11042-022-13339-4


  • DOI: https://doi.org/10.1007/s11042-022-13339-4
