Word Representation on Small Background Texts

  • Conference paper
  • First Online:

Part of the book series: Communications in Computer and Information Science (CCIS, volume 669)

Included in the following conference series: Social Media Processing (SMP 2016)

Abstract

Vector representations of words learned from large-scale background texts can serve as useful features in natural language processing and machine learning applications. Word representations in previous work were typically trained on large-scale unlabeled texts. In some scenarios, however, such large-scale background texts are not available. In this paper, we therefore propose a novel max-margin-based word representation model that learns word representations from a small set of background texts. Experimental results demonstrate the advantages of our method.
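Since the full chapter is not part of this preview, the following is only a minimal sketch of the kind of max-margin (pairwise ranking) objective the abstract refers to, in the spirit of Collobert and Weston's window-scoring approach: a context window containing its true centre word should score at least a fixed margin higher than the same window with the centre word replaced by a randomly drawn one, and parameters are updated only when that margin is violated. The function train_embeddings, the linear window scorer, and all hyperparameters below are illustrative assumptions, not the authors' actual model.

# Illustrative sketch only, not the authors' implementation: a pairwise
# max-margin (hinge) objective for learning word vectors from a small corpus.
import numpy as np

rng = np.random.default_rng(0)

def train_embeddings(corpus, vocab_size, dim=50, window=2,
                     margin=1.0, lr=0.025, epochs=5):
    """corpus: list of sentences, each a list of integer word ids."""
    W = (rng.random((vocab_size, dim)) - 0.5) / dim   # word vectors, small init
    w = rng.standard_normal(dim) / np.sqrt(dim)       # linear window scorer

    def score(ids):
        return float(w @ W[ids].mean(axis=0))         # score of one window

    span = 2 * window + 1
    for _ in range(epochs):
        for sent in corpus:
            for i in range(window, len(sent) - window):
                ids = sent[i - window:i + window + 1]
                true_c = ids[window]
                fake_c = int(rng.integers(vocab_size))          # negative sample
                corrupt = ids[:window] + [fake_c] + ids[window + 1:]
                # hinge loss: update only when the margin is violated
                if margin - score(ids) + score(corrupt) > 0:
                    delta_w = (W[true_c] - W[fake_c]) / span
                    W[true_c] += lr * w / span                  # push true centre word up
                    W[fake_c] -= lr * w / span                  # push corrupted word down
                    w += lr * delta_w
    return W

# Toy usage: four sentences over a vocabulary of 10 word ids.
toy = [[0, 1, 2, 3, 4], [5, 1, 2, 3, 6], [0, 7, 2, 8, 4], [5, 7, 2, 8, 6]]
vectors = train_embeddings(toy, vocab_size=10)

Because the hinge only fires on margin violations, each window triggers at most one sparse update, which is one property that can make ranking-style objectives workable when the background corpus is small.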

Notes

  1. http://nlp.stanford.edu/projects/glove/.
  2. https://www.freebase.com/.

Acknowledgments

The authors gratefully acknowledge the financial support provided by the National Natural Science Foundation of China under Grants No. 61672126, 61173101 and 61672127. The Titan X GPU used for this research was donated by NVIDIA Corporation.

Author information

Corresponding author

Correspondence to Lishuang Li.

Copyright information

© 2016 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Li, L., Jiang, Z., Liu, Y., Huang, D. (2016). Word Representation on Small Background Texts. In: Li, Y., Xiang, G., Lin, H., Wang, M. (eds) Social Media Processing. SMP 2016. Communications in Computer and Information Science, vol 669. Springer, Singapore. https://doi.org/10.1007/978-981-10-2993-6_12

  • DOI: https://doi.org/10.1007/978-981-10-2993-6_12

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-2992-9

  • Online ISBN: 978-981-10-2993-6

  • eBook Packages: Computer Science, Computer Science (R0)
