

Chinese Lyrics Generation Using Long Short-Term Memory Neural Network

  • Conference paper
  • First Online:
Advances in Artificial Intelligence: From Theory to Practice (IEA/AIE 2017)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10351)

Abstract

Lyrics play a great role in expressing users’ feelings, and every user has their own patterns and styles of songs. This paper proposes a method to capture users’ patterns and styles and to generate lyrics automatically, using a Long Short-Term Memory (LSTM) network combined with a language model. The LSTM network can hold long-term context information in its memory; this paper trains a context representation of each line of lyrics as a sentence vector. With a recurrent neural network-based language model, lyrics can then be generated automatically. Compared to previous systems based on word frequency, melodies, and templates, which are hard to build, the model in this paper is much simpler and fully unsupervised. With this model, patterns and styles can be seen in the generated lyrics of each individual user.
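The abstract describes two ingredients: an LSTM that folds a line of lyrics into a sentence vector, and a language model that generates text from such context. As a minimal sketch of the first ingredient, the following NumPy code steps a single LSTM cell over a sequence of word vectors and takes the final hidden state as the sentence vector. All sizes, weights, and the random "word vectors" here are illustrative assumptions, not details from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x:      input (word) vector, shape (input_size,)
    h_prev: previous hidden state, shape (hidden_size,)
    c_prev: previous cell state, shape (hidden_size,)
    W:      stacked gate weights, shape (4*hidden_size, input_size + hidden_size)
    b:      stacked gate biases, shape (4*hidden_size,)
    """
    hidden = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0 * hidden:1 * hidden])   # input gate
    f = sigmoid(z[1 * hidden:2 * hidden])   # forget gate
    o = sigmoid(z[2 * hidden:3 * hidden])   # output gate
    g = np.tanh(z[3 * hidden:4 * hidden])   # candidate cell update
    c = f * c_prev + i * g                  # cell state carries long-term context
    h = o * np.tanh(c)                      # new hidden state
    return h, c

rng = np.random.default_rng(0)
input_size, hidden_size = 8, 16
W = rng.standard_normal((4 * hidden_size, input_size + hidden_size)) * 0.1
b = np.zeros(4 * hidden_size)

# Fold a "line of lyrics" (here: 5 random word vectors) into one sentence
# vector by taking the final hidden state, as the abstract suggests.
h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for word_vec in rng.standard_normal((5, input_size)):
    h, c = lstm_step(word_vec, h, c, W, b)
sentence_vector = h
print(sentence_vector.shape)  # (16,)
```

In the paper's setting, such per-line sentence vectors would condition the RNN language model so that the generated lyrics reflect a user's style; the sketch above only shows the encoding step.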



Acknowledgement

This paper is supported by the Science and Technology Commission of Shanghai Municipality (16511102400), by Innovation Program of Shanghai Municipal Education Commission (14YZ024), and by the Jiangsu Key Laboratory of Image and Video Understanding for Social Safety (Nanjing University of Science and Technology), Grant No. 30920140122007.

Author information


Corresponding author

Correspondence to Xing Wu.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Wu, X., Du, Z., Zhong, M., Dai, S., Liu, Y. (2017). Chinese Lyrics Generation Using Long Short-Term Memory Neural Network. In: Benferhat, S., Tabia, K., Ali, M. (eds.) Advances in Artificial Intelligence: From Theory to Practice. IEA/AIE 2017. Lecture Notes in Computer Science, vol. 10351. Springer, Cham. https://doi.org/10.1007/978-3-319-60045-1_43


  • DOI: https://doi.org/10.1007/978-3-319-60045-1_43

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60044-4

  • Online ISBN: 978-3-319-60045-1

  • eBook Packages: Computer Science, Computer Science (R0)
