Using a Neural Network to Approximate an Ensemble of Classifiers

Published: 01 December 2000

Abstract

Several methods of constructing and combining an ensemble of classifiers (e.g., Bagging, Boosting) have recently been shown to improve the accuracy of commonly used classifiers (e.g., decision trees, neural networks). This accuracy gain, however, comes at the expense of higher storage and computation requirements, which can limit the utility of these methods in real-world settings. In this Letter, we propose a learning approach that allows a single neural network to approximate a given ensemble of classifiers. Experiments on a large number of real-world data sets show that this approach can substantially reduce storage and computation while maintaining accuracy similar to that of the entire ensemble.
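The idea in the abstract can be sketched in a few lines: train a bagged ensemble, then fit one small neural network to regress the ensemble's averaged soft outputs, so only the single network needs to be stored and evaluated. The sketch below is an illustration under assumed simplifications (synthetic 2D data, logistic-regression ensemble members, a one-hidden-layer network trained with squared loss), not the paper's exact procedure.

```python
# Hedged sketch: bagged ensemble of simple logistic classifiers, then a single
# small neural network trained to mimic the ensemble's averaged probabilities.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary data with boundary x0 + x1 = 0.
X = rng.normal(size=(400, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, steps=200):
    """Plain gradient-descent logistic regression; returns (w, b)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        g = p - y
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

# Bagging: each member is fit on a bootstrap resample of the training set.
members = []
for _ in range(15):
    idx = rng.integers(0, len(X), size=len(X))
    members.append(fit_logistic(X[idx], y[idx]))

def ensemble_prob(X):
    return np.mean([sigmoid(X @ w + b) for w, b in members], axis=0)

# One hidden-layer network trained to regress the ensemble's soft outputs:
# the approximation step -- targets are ensemble probabilities, not labels.
H = 8
W1 = rng.normal(scale=0.5, size=(2, H)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.5, size=(H, 1)); b2 = np.zeros(1)
targets = ensemble_prob(X).reshape(-1, 1)

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - targets) * out * (1 - out)   # squared loss through sigmoid
    d_h = (d_out @ W2.T) * (1 - h ** 2)         # backprop through tanh
    W2 -= 0.5 * h.T @ d_out / len(X); b2 -= 0.5 * d_out.mean(0)
    W1 -= 0.5 * X.T @ d_h / len(X);   b1 -= 0.5 * d_h.mean(0)

def approx_prob(X):
    return sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()

# The single network should track the ensemble's accuracy while replacing
# 15 stored models with one.
ens_acc = ((ensemble_prob(X) > 0.5) == y).mean()
net_acc = ((approx_prob(X) > 0.5) == y).mean()
print(ens_acc, net_acc)
```

At deployment time only `approx_prob` (two small weight matrices) is needed, which is the storage and computation saving the abstract describes.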

References

[1] Breiman, L.: Bagging predictors, Machine Learning, 24 (1996), 123-140.
[2] Freund, Y. and Schapire, R.: Experiments with a new boosting algorithm, In: Proc. Thirteenth Int. Conf. Machine Learning, Morgan Kaufmann, 1996, pp. 148-156.
[3] Quinlan, J. R.: Bagging, boosting, and C4.5, In: Proc. Thirteenth Nat. Conf. Artificial Intelligence, AAAI/MIT Press, 1996, pp. 725-730.
[4] Bauer, E. and Kohavi, R.: An empirical comparison of voting classification algorithms: bagging, boosting and variants, Machine Learning, 36 (1999), 105-139.
[5] Maclin, R. and Opitz, D.: An empirical evaluation of bagging and boosting, In: Proc. Fourteenth Nat. Conf. Artificial Intelligence, AAAI/MIT Press, 1997, pp. 546-551.
[6] Dietterich, T. G.: Machine-learning research: four current directions, AI Magazine, Winter (1997), 97-136.
[7] Margineantu, D. D. and Dietterich, T. G.: Pruning adaptive boosting, In: Proc. Fourteenth Int. Conf. Machine Learning, 1997, pp. 98-106.
[8] Domingos, P.: Knowledge acquisition from examples via multiple models, In: Proc. Fourteenth Int. Conf. Machine Learning, 1997, pp. 211-218.
[9] Craven, M. W. and Shavlik, J. W.: Learning symbolic rules using artificial neural networks, In: Proc. 10th Int. Conf. Machine Learning, Amherst, MA, Kaufmann, 1993, pp. 73-80.
[10] Craven, M. W. and Shavlik, J. W.: Extracting tree-structured representations from trained networks, In: D. S. Touretzky, M. C. Mozer and M. Hasselmo, eds., Advances in Neural Information Processing Systems 8, MIT Press, 1996, pp. 24-30.
[11] Efron, B. and Tibshirani, R.: An Introduction to the Bootstrap, New York, Chapman and Hall, 1993.
[12] Merz, C. J. and Murphy, P. M.: UCI repository of machine learning databases, http://www.ics.uci.edu/~mlearn/MLRepository.html, 1996.
[13] Breiman, L., Friedman, J. H., Olshen, R. A. and Stone, C. J.: Classification and Regression Trees, Wadsworth International Group, 1984.
[14] Kohavi, R.: A study of cross-validation and bootstrap for accuracy estimation and model selection, In: Proc. Int. Joint Conf. Artificial Intelligence, 1995, pp. 1137-1143.




Published In

Neural Processing Letters, Volume 12, Issue 3
December 1, 2000
101 pages
ISSN: 1370-4621

Publisher

Kluwer Academic Publishers

United States


Author Tags

  1. approximator
  2. bagging
  3. boosting
  4. ensemble of classifiers
  5. neural networks

Qualifiers

  • Article



Cited By

  • (2021) Does knowledge distillation really work? Proceedings of the 35th International Conference on Neural Information Processing Systems, 10.5555/3540261.3540790, pp. 6906-6919. Online publication date: 6-Dec-2021.
  • (2016) Wise teachers train better DNN acoustic models. EURASIP Journal on Audio, Speech, and Music Processing, 10.1186/s13636-016-0088-7, 2016:1, pp. 1-19. Online publication date: 1-Dec-2016.
  • (2009) Artificial neural network reduction through oracle learning. Intelligent Data Analysis, 10.5555/1551758.1551765, 13:1, pp. 135-149. Online publication date: 1-Jan-2009.
  • (2007) Agent-Based Approach to Distributed Ensemble Learning of Fuzzy ARTMAP Classifiers. Proceedings of the 1st KES International Symposium on Agent and Multi-Agent Systems: Technologies and Applications, 10.1007/978-3-540-72830-6_84, pp. 805-814. Online publication date: 31-May-2007.
  • (2006) Model compression. Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 10.1145/1150402.1150464, pp. 535-541. Online publication date: 20-Aug-2006.
  • (2002) Multistage Neural Network Ensembles. Proceedings of the Third International Workshop on Multiple Classifier Systems, 10.5555/648056.744253, pp. 91-97. Online publication date: 24-Jun-2002.
