Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
Cantú-Paz, E. (2006). Feature Subset Selection with Hybrids of Filters and Evolutionary Algorithms. In: Pelikan, M., Sastry, K., Cantú-Paz, E. (eds) Scalable Optimization via Probabilistic Modeling. Studies in Computational Intelligence, vol 33. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-34954-9_13
DOI: https://doi.org/10.1007/978-3-540-34954-9_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-34953-2
Online ISBN: 978-3-540-34954-9