Generalized Information-Theoretic Measures for Feature Selection

  • Conference paper
Adaptive and Natural Computing Algorithms (ICANNGA 2013)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7824)


Abstract

Information-theoretic measures are frequently employed to select the most relevant subset of features from datasets. This paper focuses on the analysis of continuous-valued features. We compare the common approach, in which features are discretized prior to the analysis, with the direct use of their exact values. Because computing continuous information-theoretic measures based on Shannon entropy is prohibitively expensive, the generalized Renyi and Tsallis measures are considered instead. To enable computation with continuous Tsallis measures, a novel modification of the information potential is introduced. The quality of the analysed measures was assessed indirectly, through the classification accuracy obtained in conjunction with a greedy feature selection process. Experiments on datasets from the UCI repository show considerable improvements when both generalized continuous measures are used.
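For orientation, the Renyi entropy of order α is H_α(X) = (1/(1−α)) log ∫ p(x)^α dx and the Tsallis entropy is S_α(X) = (1/(α−1)) (1 − ∫ p(x)^α dx); both depend on the density p only through the information potential V_α(X) = ∫ p(x)^α dx. For α = 2, a Parzen-window density estimate with Gaussian kernels gives V_2 in closed form as a mean of pairwise Gaussians, which is what makes these generalized measures cheap to evaluate on continuous data. The sketch below illustrates this standard estimator together with a generic greedy forward-selection loop. It is a minimal illustration under stated assumptions (the kernel width, scoring criterion, and function names are ours, not the paper's), and it uses the plain plug-in Tsallis estimate rather than the paper's modified information potential, which is not reproduced here.

```python
import numpy as np

def quadratic_ip(x, sigma=1.0):
    """Parzen-window estimate of the quadratic information potential
    V_2(X) = integral of p(x)^2.  With Gaussian kernels the integral has a
    closed form: the mean of pairwise Gaussians of width sigma*sqrt(2)."""
    x = np.asarray(x, dtype=float).ravel()
    d2 = (x[:, None] - x[None, :]) ** 2      # pairwise squared distances
    s2 = 2.0 * sigma ** 2                    # variance after kernel convolution
    return np.mean(np.exp(-d2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2))

def renyi2(x, sigma=1.0):
    """Quadratic Renyi entropy: H_2(X) = -log V_2(X)."""
    return -np.log(quadratic_ip(x, sigma))

def tsallis2(x, sigma=1.0):
    """Naive plug-in quadratic Tsallis entropy: S_2(X) = 1 - V_2(X).
    (Illustrative only; the paper introduces a modified potential.)"""
    return 1.0 - quadratic_ip(x, sigma)

def greedy_select(X, y, score, k):
    """Greedy forward selection: repeatedly add the feature whose
    inclusion maximizes the supplied relevance score."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = max(remaining, key=lambda j: score(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Any relevance criterion built from these entropies, for example a quadratic mutual information between the candidate features and the class labels, can be supplied as the score argument; the discretization-based alternative would simply replace the kernel estimator with histogram counts.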


References

  1. Dash, M., Liu, H.: Feature Selection for Classification. Intelligent Data Analysis 1(1-4), 131–156 (1997)

  2. Blum, A., Langley, P.: Selection of relevant features and examples in machine learning. Artificial Intelligence 97(1-2), 245–271 (1997)

  3. Lewis, D.: Feature selection and feature extraction for text categorization. In: Proceedings of the Speech and Natural Language Workshop, pp. 212–217. Morgan Kaufmann, San Francisco (1992)

  4. Liu, H., Sun, J., Liu, L., Zhang, H.: Feature selection with dynamic mutual information. Pattern Recognition 42(7), 1330–1339 (2009)

  5. Estevez, P.A., Tesmer, M., Perez, C.A., Zurada, J.M.: Normalized Mutual Information Feature Selection. IEEE Transactions on Neural Networks 20(2), 189–201 (2009)

  6. Lopes, F.M., Martins, D.C., Cesar, R.M.: Feature selection environment for genomic applications. BMC Bioinformatics 9(1), 451–458 (2008)

  7. Fleuret, F.: Fast binary feature selection with conditional mutual information. Journal of Machine Learning Research 5, 1531–1555 (2004)

  8. Hild II, K.E., Erdogmus, D., Torkkola, K., Principe, J.C.: Feature Extraction Using Information-Theoretic Learning. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(9), 1385–1392 (2006)

  9. Furuichi, S.: Information theoretical properties of Tsallis entropies. Journal of Mathematical Physics 47(2) (2006)

  10. Mejía-Lavalle, M., Morales, E.F., Arroyo, G.: Two Simple and Effective Feature Selection Methods for Continuous Attributes with Discrete Multi-class. In: Gelbukh, A., Kuri Morales, Á.F. (eds.) MICAI 2007. LNCS (LNAI), vol. 4827, pp. 452–461. Springer, Heidelberg (2007)

  11. Vila, M., Bardera, A., Feixas, M., Sbert, M.: Tsallis Mutual Information for Document Classification. Entropy 13, 1694–1707 (2011)

  12. Oh, I.-S., Lee, J.-S., Moon, B.-R.: Hybrid genetic algorithms for feature selection. IEEE Transactions on Pattern Analysis and Machine Intelligence 26(11), 1424–1437 (2004)

  13. Hua, J., Tembe, W.D., Dougherty, E.R.: Performance of feature-selection methods in the classification of high-dimension data. Pattern Recognition 42(3), 409–442 (2009)

  14. Lin, S.-W., Tseng, T.-Y., Chou, S.-Y., Chen, S.-C.: A simulated-annealing-based approach for simultaneous parameter optimization and feature selection of back-propagation networks. Expert Systems with Applications 34(2) (2008)

  15. Somol, P., Pudil, P., Kittler, J.: Fast branch & bound algorithms for optimal feature selection. IEEE Transactions on Pattern Analysis and Machine Intelligence 26(7), 900–912 (2004)

  16. Tang, E.-K., Suganthan, P., Yao, X.: Gene selection algorithms for microarray data based on least square support vector machine. BMC Bioinformatics 7, 95 (2006)

  17. Hall, M., Frank, E., Holmes, G., Pfahringer, B., Reutemann, P., Witten, I.H.: The WEKA Data Mining Software: An Update. SIGKDD Explorations 11(1) (2009)

  18. Frank, A., Asuncion, A.: UCI Machine Learning Repository. University of California, School of Information and Computer Science, Irvine, CA (2012), http://archive.ics.uci.edu/ml


Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sluga, D., Lotric, U. (2013). Generalized Information-Theoretic Measures for Feature Selection. In: Tomassini, M., Antonioni, A., Daolio, F., Buesser, P. (eds) Adaptive and Natural Computing Algorithms. ICANNGA 2013. Lecture Notes in Computer Science, vol 7824. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-37213-1_20

  • DOI: https://doi.org/10.1007/978-3-642-37213-1_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-37212-4

  • Online ISBN: 978-3-642-37213-1
