
Binary classification using ensemble neural networks and interval neutrosophic sets

Published: 01 August 2009

Abstract

This paper presents an approach to binary classification based on ensemble neural networks and interval neutrosophic sets. A bagging technique is applied to an ensemble of pairs of neural networks, each pair trained to predict the degrees of truth membership, indeterminacy membership, and false membership in the interval neutrosophic sets. In this approach, error and vagueness are also quantified during classification. Several aggregation techniques are proposed. We applied the techniques to classical benchmark problems from the UCI machine learning repository, including ionosphere, Pima Indians diabetes, and liver disorders. Our approaches improve classification performance compared with existing techniques that use only the truth membership values. Furthermore, the proposed ensemble techniques also give better results than a single pair of neural networks.
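The sketch below illustrates the general idea described in the abstract: a bagged ensemble of paired feed-forward networks, where one network of each pair is trained toward the truth membership and the other toward the false membership, with the ensemble outputs aggregated into a binary decision. This is a minimal sketch, not the authors' implementation: the dataset (scikit-learn's breast cancer set standing in for the UCI problems), the network sizes, the averaging aggregation rule, and the indeterminacy proxy are all assumptions made for illustration.

```python
# Minimal sketch of an ensemble of truth/falsity network pairs with bagging.
# Assumptions (not from the paper): dataset, network sizes, simple averaging
# as the aggregation rule, and 1 - |T - F| as a vagueness proxy.
import numpy as np
from sklearn.datasets import load_breast_cancer      # stand-in for the UCI benchmark sets
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n_members = 10
truth_preds, false_preds = [], []

for m in range(n_members):
    # Bagging: draw a bootstrap sample for each ensemble member.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    Xb, yb = X_tr[idx], y_tr[idx]

    # Pair of networks: one predicts truth membership (targets y),
    # the other predicts false membership (targets 1 - y).
    truth_net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=m).fit(Xb, yb)
    false_net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=m).fit(Xb, 1 - yb)

    truth_preds.append(np.clip(truth_net.predict(X_te), 0, 1))
    false_preds.append(np.clip(false_net.predict(X_te), 0, 1))

# One possible aggregation: average the memberships over the ensemble and
# classify as positive when truth exceeds falsity.
T = np.mean(truth_preds, axis=0)
F = np.mean(false_preds, axis=0)
indeterminacy = 1.0 - np.abs(T - F)    # simple vagueness proxy, not the paper's definition
y_hat = (T > F).astype(int)

print("accuracy:", (y_hat == y_te).mean())
print("mean indeterminacy:", indeterminacy.mean())
```

With this kind of setup, the indeterminacy values can be inspected alongside the predictions, so that examples on which the truth and falsity networks disagree only weakly can be flagged as uncertain rather than silently classified.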





Publisher

Elsevier Science Publishers B. V.

Netherlands


Author Tags

  1. Binary classification
  2. Feed-forward backpropagation neural network
  3. Interval neutrosophic sets
  4. Uncertainty

Qualifiers

  • Article


Cited By

  • (2017) Room Occupancy Detection using Modified Stacking. Proceedings of the 9th International Conference on Machine Learning and Computing, pp. 162-166. DOI: 10.1145/3055635.3056597
  • (2014) Effective tumor feature extraction for smart phone based microwave tomography breast cancer screening. Proceedings of the 29th Annual ACM Symposium on Applied Computing, pp. 674-679. DOI: 10.1145/2554850.2554936
  • (2011) Combining complementary neural network and error-correcting output codes for multiclass classification problems. Proceedings of the 10th WSEAS International Conference on Applied Computer and Applied Computational Science, pp. 49-54. DOI: 10.5555/1965610.1965617
  • (2010) Classification of imbalanced data by combining the complementary neural network and SMOTE algorithm. Proceedings of the 17th International Conference on Neural Information Processing: Models and Applications, Part II, pp. 152-159. DOI: 10.5555/1939751.1939773
  • (2010) Using duo output neural network to solve binary classification problems. Proceedings of the 10th WSEAS International Conference on Applied Computer Science, pp. 286-290. DOI: 10.5555/1895260.1895314
