Abstract
Feature selection is important for many learning problems, improving both speed and quality. The main approaches are individual evaluation and subset evaluation. Individual evaluation methods, such as Relief, are efficient but cannot detect redundant features, which limits their applicability. We propose a new feature selection algorithm, based on the core idea of Relief, that removes both irrelevant and redundant features. For each feature, not only its effectiveness but also its informativeness is evaluated, taking the performance of the other features into account. Experiments on benchmark datasets show that the new algorithm removes both irrelevant and redundant features while retaining the efficiency of an individual evaluation method.
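As background for the abstract, the weight update of the classic Relief algorithm (Kira and Rendell, 1992), which this paper adapts, can be sketched as below. This is a minimal illustrative sketch, not the authors' adapted algorithm; all function and variable names are our own. It also shows the limitation the paper targets: a redundant duplicate of a relevant feature receives the same high weight as the original, so plain Relief cannot filter it out.

```python
# Minimal sketch of the classic Relief weight update (illustrative only;
# not the paper's adapted algorithm). High weight = relevant feature.
import random

def diff(f, a, b, lo, hi):
    """Normalized difference of feature f between instances a and b."""
    span = hi[f] - lo[f]
    return abs(a[f] - b[f]) / span if span else 0.0

def relief(X, y, m=None, seed=0):
    """Return one weight per feature.

    Note: plain Relief cannot lower the weight of a redundant copy of a
    relevant feature -- the limitation the paper addresses.
    """
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    lo = [min(row[f] for row in X) for f in range(d)]
    hi = [max(row[f] for row in X) for f in range(d)]
    w = [0.0] * d
    m = m or n                      # number of sampled instances
    for _ in range(m):
        i = rng.randrange(n)
        dist = lambda j: sum(diff(f, X[i], X[j], lo, hi) for f in range(d))
        # nearest hit (same class) and nearest miss (other class)
        h = min((j for j in range(n) if j != i and y[j] == y[i]), key=dist)
        miss = min((j for j in range(n) if y[j] != y[i]), key=dist)
        for f in range(d):
            # reward separation from the miss, penalize distance to the hit
            w[f] += (diff(f, X[i], X[miss], lo, hi)
                     - diff(f, X[i], X[h], lo, hi)) / m
    return w
```

On a toy dataset where feature 0 determines the class and feature 1 is noise, `relief` assigns feature 0 a clearly higher weight; adding a duplicate of feature 0 would give the duplicate an equally high weight, which is why redundancy elimination needs the extra informativeness criterion the paper introduces.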
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Wu, T., Xie, K., Nie, C., Song, G. (2012). An Adaption of Relief for Redundant Feature Elimination. In: Wang, J., Yen, G.G., Polycarpou, M.M. (eds) Advances in Neural Networks – ISNN 2012. ISNN 2012. Lecture Notes in Computer Science, vol 7368. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31362-2_9
DOI: https://doi.org/10.1007/978-3-642-31362-2_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-31361-5
Online ISBN: 978-3-642-31362-2