User’s Review Habits Enhanced Hierarchical Neural Network for Document-Level Sentiment Classification

Published in: Neural Processing Letters

Abstract

Document-level sentiment classification aims to predict the sentiment polarity of document-level reviews posted by users about products and services. Many neural-network-based methods have achieved strong results on sentiment classification tasks. These methods usually focus on mining useful information from the text of review documents, but they ignore the importance of users' review habits. Reviews posted by the same user about different products share similar review habits, and reviews with highly similar review habits often carry similar sentiment ratings. In this paper, we propose a novel sentiment classification algorithm, named HUSN, that uses a user's review habits to enhance a hierarchical neural network. First, we partition the reviews in the training set by user; all the reviews of each user are aggregated together and called that user's historical reviews. Second, the target review in the test set and its multiple historical reviews in the training set are fed into a Long Short-Term Memory based hierarchical neural network to obtain review document representations that encode the user's review habits. Finally, we compute the similarities between the target review's document representation and the representations of the historical reviews. The higher the similarity, the closer the review habits of the two reviews from the same user, and the closer their sentiment ratings. Experimental results show that these similarities between the review habits of a user's reviews further improve the performance of document-level sentiment classification, and that HUSN outperforms all baseline methods on three publicly available document-level review datasets.
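The final step described in the abstract, comparing a target review's document representation against the same user's historical review representations, can be sketched as follows. This is a minimal illustration only: the vector dimensions, function names, and the use of cosine similarity are assumptions for exposition, and the paper's actual hierarchical LSTM encoder is not reproduced here; each review is assumed to be already encoded as a fixed-size document vector.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two document representation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_historical_reviews(target_vec, historical_vecs):
    """Rank a user's historical reviews by similarity to the target review.

    target_vec:      representation of the target review, shape (d,)
    historical_vecs: representations of the same user's historical
                     reviews, shape (n, d)

    Returns (indices ordered most-similar-first, similarity scores).
    """
    sims = [cosine_similarity(target_vec, h) for h in historical_vecs]
    order = sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)
    return order, sims

# Toy example with 3-dimensional "document vectors" (illustrative only)
target = np.array([1.0, 0.0, 0.0])
history = np.array([
    [0.9, 0.1, 0.0],   # similar review habits -> high similarity
    [0.0, 1.0, 0.0],   # orthogonal -> similarity 0
])
order, sims = rank_historical_reviews(target, history)
```

Under this sketch, a historical review whose representation is closest to the target's would be taken as the strongest signal for the target review's sentiment rating, matching the abstract's claim that higher habit similarity implies closer ratings.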


Data Availability and Materials

All data used to support the findings of this study are included within the paper.

Notes

  1. http://ir.hit.edu.cn/dytang/paper/acl2015/dataset.7z.


Acknowledgements

This work was supported by the Major Program of the National Social Science Foundation of China (Grant No. 18ZDA032) and the National Natural Science Foundation of China (Grant No. 61876001).

Funding

Funding for this work is described in the Acknowledgements.

Author information


Corresponding author

Correspondence to Shu Zhao.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Code Availability

The code used in the current study is available from the corresponding author on reasonable request.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Chen, J., Yu, J., Zhao, S. et al. User’s Review Habits Enhanced Hierarchical Neural Network for Document-Level Sentiment Classification. Neural Process Lett 53, 2095–2111 (2021). https://doi.org/10.1007/s11063-021-10423-y

