
Feature Subset Selection with Hybrids of Filters and Evolutionary Algorithms

  • Chapter
Scalable Optimization via Probabilistic Modeling

Part of the book series: Studies in Computational Intelligence ((SCI,volume 33))

  • 1063 Accesses






Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Cantú-Paz, E. (2006). Feature Subset Selection with Hybrids of Filters and Evolutionary Algorithms. In: Pelikan, M., Sastry, K., Cantú-Paz, E. (eds) Scalable Optimization via Probabilistic Modeling. Studies in Computational Intelligence, vol 33. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-34954-9_13


  • DOI: https://doi.org/10.1007/978-3-540-34954-9_13

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34953-2

  • Online ISBN: 978-3-540-34954-9

  • eBook Packages: Engineering (R0)
