
Recent advances in decision trees: an updated survey

Published: 10 October 2022

Abstract

Decision Trees (DTs) are predictive models in supervised learning, known not only for their unquestionable utility in a wide range of applications but also for their interpretability and robustness. Research on the subject is still going strong almost 60 years after its original inception, and in the last decade several researchers have tackled key matters in the field. Although many excellent surveys have been published in the past, none covers the last decade of the field as a whole. This paper reviews the main recent advances in DT research, focusing on three major goals of a predictive learner: fitting the training data, generalization, and interpretability. Moreover, by organizing several topics that have previously been analyzed in isolation, this survey provides an overview of the field, its key concerns, and future trends, serving as a good entry point both for researchers and for newcomers to the machine learning community.
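As a concrete illustration of the three goals named above, the short sketch below (not drawn from the survey itself, and assuming scikit-learn with its bundled Iris dataset) fits a depth-limited decision tree, compares training and test accuracy as a rough proxy for fit versus generalization, and prints the learned rules, which is the sense in which DTs are directly interpretable. The dataset, depth limit, and random seeds are illustrative choices only.

# Minimal sketch (assumes scikit-learn): fit a small decision tree, check
# fit vs. generalization, and print the learned rules for inspection.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0)

# max_depth acts as a simple regularizer here: a shallower tree usually
# generalizes better and stays easier to read, at the cost of a looser fit
# on the training set.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)

print("train accuracy:", clf.score(X_train, y_train))
print("test accuracy: ", clf.score(X_test, y_test))
print(export_text(clf, feature_names=list(data.feature_names)))

Printing the fitted tree as a set of threshold rules is what makes the model directly inspectable; deeper or multivariate trees trade away some of that readability, which is one of the tensions the survey examines.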



Published In

Artificial Intelligence Review, Volume 56, Issue 5
May 2023, 990 pages

Publisher

Kluwer Academic Publishers

United States

Publication History

Published: 10 October 2022

Author Tags

  1. Decision trees
  2. Machine learning
  3. Interpretable models
  4. Classification algorithms

Qualifiers

  • Research-article


Cited By

  • (2024) Enhancing Education and Teaching Management Through Data Mining and Support Vector Machine Algorithm. International Journal of e-Collaboration 20(1):1–16. doi:10.4018/IJeC.357998. Online publication date: 7-Nov-2024
  • (2024) Decision Tree Algorithm Based Health Analysis System for Big Data. Proceedings of the 2024 4th International Conference on Artificial Intelligence, Automation and High Performance Computing, pp 220–225. doi:10.1145/3690931.3690969. Online publication date: 19-Jul-2024
  • (2024) A decision-making tool for the determination of the distribution center location in a humanitarian logistics network. Expert Systems with Applications: An International Journal 238(PC). doi:10.1016/j.eswa.2023.122010. Online publication date: 27-Feb-2024
  • (2023) Necessary and sufficient conditions for optimal decision trees using dynamic programming. Proceedings of the 37th International Conference on Neural Information Processing Systems, pp 9173–9212. doi:10.5555/3666122.3666526. Online publication date: 10-Dec-2023
  • (2023) WCDForest: a weighted cascade deep forest model toward the classification tasks. Applied Intelligence 53(23):29169–29182. doi:10.1007/s10489-023-04794-z. Online publication date: 1-Dec-2023
  • (2023) Formal XAI via Syntax-Guided Synthesis. Bridging the Gap Between AI and Reality, pp 119–137. doi:10.1007/978-3-031-46002-9_7. Online publication date: 23-Oct-2023
