Variational Bayesian multinomial logistic Gaussian process classification

Multimedia Tools and Applications

Abstract

The multinomial logistic Gaussian process is a flexible non-parametric model for multi-class classification tasks, which arise frequently in real-world pattern recognition problems. In such settings, the multinomial logistic (softmax) function is usually taken as the likelihood function, but exact inference for this model has proved to be a challenging problem because it requires high-dimensional integration. In this paper, we propose an approximate variational Bayesian inference method for the multinomial logistic Gaussian process model. First, we compute a second-order approximation to the logarithm of the logistic likelihood function using a Taylor series expansion, and derive the posterior distributions of all hidden variables and model parameters using variational Bayesian inference. Second, we derive the predictive distribution of the latent classification variable at a test data point using the properties of the Cauchy product for a standard Gaussian process with the learned model parameters. We conducted experiments on a number of synthetic and real datasets to verify the effectiveness of the proposed model. The results show that the proposed model achieves superior classification performance compared with existing methods.
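
To make the first step above concrete, the following sketch (in LaTeX notation) writes out the multinomial logistic (softmax) likelihood and a generic second-order Taylor expansion of its logarithm. The expansion point \hat{\mathbf{f}} and the symbols \mathbf{g} and \mathbf{H} are illustrative notation for this sketch only and are not taken from the paper's own derivation.

% Softmax likelihood for class c given the latent GP values f = (f_1, ..., f_C),
% and a generic second-order Taylor expansion of its logarithm around \hat{f}.
% The expansion point and the symbols g, H are illustrative, not the authors' exact choices.
\[
  p(y = c \mid \mathbf{f}) = \frac{\exp(f_c)}{\sum_{k=1}^{C} \exp(f_k)},
  \qquad
  \log p(y = c \mid \mathbf{f}) = f_c - \log \sum_{k=1}^{C} \exp(f_k).
\]
\[
  \log p(y = c \mid \mathbf{f}) \approx \log p(y = c \mid \hat{\mathbf{f}})
  + (\mathbf{f} - \hat{\mathbf{f}})^{\top} \mathbf{g}
  - \tfrac{1}{2} (\mathbf{f} - \hat{\mathbf{f}})^{\top} \mathbf{H} (\mathbf{f} - \hat{\mathbf{f}}),
\]
\[
  \mathbf{g} = \mathbf{e}_c - \boldsymbol{\pi}(\hat{\mathbf{f}}),
  \qquad
  \mathbf{H} = \operatorname{diag}\!\big(\boldsymbol{\pi}(\hat{\mathbf{f}})\big)
  - \boldsymbol{\pi}(\hat{\mathbf{f}})\,\boldsymbol{\pi}(\hat{\mathbf{f}})^{\top},
  \qquad
  \pi_k(\mathbf{f}) = \frac{\exp(f_k)}{\sum_{j=1}^{C} \exp(f_j)}.
\]

In general, a quadratic expansion of this kind makes the log-likelihood term Gaussian in form, so it combines analytically with the Gaussian process prior; this is what makes closed-form variational posterior updates and a tractable predictive distribution possible.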

References

  1. Beal MJ (2003) Variational Algorithms for Approximate Bayesian Inference. A Thesis submitted for the degree of Doctor of Philosophy of the university of London

  2. Chai KMA (2012) Variational Multinomial Logit Gaussian Process. J Mach Learn Res 13:1745–1808

    MathSciNet  MATH  Google Scholar 

  3. Csato L et al (2000) Efficient Approaches to Gaussian Process Classification. In: Neural Information Processing Systems 12, pp 251–257, MIT Press

  4. Drugowitsch J (2014, Jun) Variational Bayesian Inference for Linear and Logistic Regression, eprint arXiv:1310.5438v2

  5. Ghahramani Z, Beal MJ (2000a) Graphical models and variational methods. In: Saad D, Opper M (eds) Advanced Mean Field methods-Theory and Practice. MIT Press, Cambridge

    Google Scholar 

  6. Ghahramani Z, Beal MJ (2000b) Variational inference for Bayesian mixtures of factor analyzers. In: Solla SA, Leen TK, Muller K-R (eds) Advances in Neural Information Processing Systems, vol 12. MIT Press, Cambridge, MA, pp 449–455

    Google Scholar 

  7. Gibbs MN (1997) Bayesian Gaussian Processes for Regression and Classification, Ph.D. thesis, Inferential Sciences Group, Cavendish laboratory, Cambridge University

  8. Gibbs MN, Mackay DJC (2000) Variational Gaussian Process classifiers. IEEE Trans Neural Netw 11(6):1458–1464

    Article  Google Scholar 

  9. Girolani M, Rogers S (2005) Variational Bayesian Multinomial Probit Regression with Gaussian process priors, technical report: TR-2005-205, Depart. Of computer Science, University of Glasgow

  10. Kim HC, Ghahramani Z (2006) Bayesian Gaussian Process classification with the EM-EP Algorithm. IEEE Transaction in PAMI 28:1945–1958

    Google Scholar 

  11. Lama N, Girolami M (2008) vbmp: Variational Bayesian Multinomial Probit regression for multi-class classification in R. Bioinformatics 24(1):135–136

    Article  Google Scholar 

  12. Mackay DJC (1998) Introduction To Gaussian Processes, NIPS’97 Tutorial Notes

  13. Minka TP (2001) Expectation Propagation for Approximate Bayesian Inference. In UAI, Morgan Kaufmann 362-369

  14. Neal RM (1998) Regression and Classification Using Gaussian Process Priors, Bayesian Statistics, vol 6. Oxford University Press, Oxford, pp 000–000

    Google Scholar 

  15. Nicklisch H, Rasmussen CE (2008) Approximation for Binary Gaussian process Classification. Journal of Machine Learning Research 9:2035–2075

    MathSciNet  MATH  Google Scholar 

  16. Opper M, Archambeau C (2009) The Variaitonal Gaussian Approximation Revisited, Neural Computation,  21(3):786–792

  17. Rasmussen CE, Williams CKI (2006) Gaussian Processes for Machine Learning. MIT Press, Cambridge

    MATH  Google Scholar 

  18. Seeger M, Jordan MI (2004) Sparse Gaussian Process Classification With Multiple Classes, Technical Report TR 661, Department of Statistics, University of California at Berkeley

  19. Shi JQ, Murray-Smith R, Titterington DM (2003) Bayesian regression and Classification Using Mixtures of Gaussian Processes. Int J Adapt Control Signal Process 17:1–16

    Article  MATH  Google Scholar 

  20. Williams CKI, Barber D (1998) Bayesian Classification with Gaussian Processes. IEEE Tran On PAMI 12:1342–1351

    Article  Google Scholar 

  21. Williams CKI, Rasmussen CE (1995) Gaussian Processes for Regression. Advances in Neural Information Processing Systems 8;514–520. MIT Press

  22. FISHER, R.A.(1936) The Use of Multiple Measurements in Taxonomic Problems. Annals of Eugenics, 7(2), pp.179–188. Available Iris dataset at : http://archive.ics.uci.edu/ml/datasets/Iris

  23. S. Aeberhard, D. Coomans and O. de Vel, Comparison of Classifiers in High Dimensional Settings, Tech. Rep. no. 92-02, (1992), Dept. of Computer Science and Dept. of Mathematics and Statistics, James Cook University of North Queensland. (Also submitted to Technometrics). Available Wine dataset at: http://archive.ics.uci.edu/ml/datasets/Wine

  24. Quinlan,J.R., Compton,P.J., Horn,K.A., & Lazurus,L. (1986). Inductive knowledge acquisition: A case study. In Proceedings of the Second Australian Conference on Applications of Expert Systems. Sydney, Australia. Avaliable Thyroid dataset at: http://archive.ics.uci.edu/ml/datasets/Thyroid+Disease

Download references

Acknowledgments

This work was partially supported by the Korea National Research Foundation (NRF-2017R1D1A1B03028808) and by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (NRF-2015R1C1A1A02036495).

Author information

Corresponding author

Correspondence to Inseop Na.

About this article

Cite this article

Cho, W., Na, I., Kim, S. et al. Variational Bayesian multinomial logistic Gaussian process classification. Multimed Tools Appl 77, 18563–18582 (2018). https://doi.org/10.1007/s11042-017-5210-z
