Abstract
In this paper, a hybrid decision forest is constructed by double randomization of the original training set. In this forest, each base decision tree classifier is paired with an additional classifier, a LogitBoosted decision stump. In the first randomization, the resamples used to train the decision trees are drawn; in the second randomization, a second set of resamples is generated from the out-of-bag samples of the first set. The boosted decision stumps are constructed on these second resamples. Together, the second resamples and the resamples on which the base tree classifiers are trained approximate the original training set, so the full training set is utilized to construct a hybrid decision forest with a larger feature space. We apply this hybrid decision forest to two real-world problems: a) credit-score classification, and b) short-term extreme-rainfall forecasting. Its performance on these two problems is compared with that of several well-known machine learning methods. The overall results suggest that the new hybrid decision forest is capable of yielding commendable predictive performance.
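The double randomization described above can be sketched as follows. This is a minimal illustration of the resampling scheme only, not the authors' implementation: the first bootstrap resample would train a base decision tree, its out-of-bag cases supply a second bootstrap resample for the LogitBoosted stump, and the function name `double_randomize` is assumed for illustration.

```python
import numpy as np

def double_randomize(n, rng):
    # First randomization: a bootstrap resample of the n training indices,
    # used to train the base decision tree classifier.
    first = rng.choice(n, size=n, replace=True)
    # Out-of-bag indices: training cases absent from the first resample.
    oob = np.setdiff1d(np.arange(n), first)
    # Second randomization: a bootstrap resample drawn from the out-of-bag
    # cases, used to train the LogitBoosted decision stump.
    second = rng.choice(oob, size=len(oob), replace=True) if len(oob) else oob
    return first, oob, second

rng = np.random.default_rng(0)
first, oob, second = double_randomize(100, rng)
```

Because the out-of-bag cases are exactly those missed by the first resample, the two resamples together draw on the whole training set, which is the sense in which they approximate it.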
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Faisal, Z.M., Monira, S.S., Hirose, H. (2012). DRFLogitBoost: A Double Randomized Decision Forest Incorporated with LogitBoosted Decision Stumps. In: Pan, JS., Chen, SM., Nguyen, N.T. (eds) Intelligent Information and Database Systems. ACIIDS 2012. Lecture Notes in Computer Science(), vol 7196. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-28487-8_30
Print ISBN: 978-3-642-28486-1
Online ISBN: 978-3-642-28487-8