Batch-mode active ordinal classification based on expected model output change and leadership tree

Published: 04 January 2025

Abstract

While numerous batch-mode active learning (BMAL) methods have been developed for nominal classification, a BMAL method tailored for ordinal classification is conspicuously absent. This paper proposes an effective BMAL method for ordinal classification and argues that a BMAL method should guarantee that the instances selected in each iteration are highly informative, diverse from the labeled instances, and diverse from each other. We first introduce an expected model output change criterion based on a kernel extreme learning machine-based ordinal classification model and demonstrate that the criterion is a composite containing both an informativeness assessment and a diversity assessment. Selecting instances that score highly on this criterion ensures that the selected instances are highly informative and diverse from the labeled instances. To ensure that the selected instances are also diverse from each other, we propose a leadership tree-based batch instance selection approach, drawing inspiration from the density peak clustering algorithm. Our BMAL method can thus select a batch of peak-scoring points from different high-scoring regions in each iteration. The effectiveness of the proposed method is empirically examined through comparisons with several state-of-the-art BMAL methods.
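The batch selection idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm, only an assumed density-peak-style selection rule: given per-instance informativeness scores (e.g., from an expected model output change criterion), each candidate is weighted by its distance to the nearest higher-scoring candidate, so that only one peak per high-scoring region enters the batch. The function name `select_batch` and the product ranking `score * delta` are illustrative assumptions.

```python
import numpy as np

def select_batch(scores, X, batch_size):
    """Pick a batch of peak-scoring points from distinct regions.

    Density-peak-inspired rule (illustrative, not the paper's exact
    method): for each candidate i, compute delta_i, the distance to
    the nearest candidate with a higher score, then rank candidates
    by scores[i] * delta_i. A point surrounded by higher-scoring
    neighbors gets a small delta, so at most one point per
    high-scoring region is favored.
    """
    n = len(scores)
    # Pairwise Euclidean distances between all candidates.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    delta = np.empty(n)
    order = np.argsort(-scores)              # indices, highest score first
    delta[order[0]] = dists[order[0]].max()  # global peak: use its farthest distance
    for rank, i in enumerate(order[1:], start=1):
        higher = order[:rank]                # all candidates scoring above i
        delta[i] = dists[i, higher].min()
    gamma = scores * delta                   # high score AND far from higher-scoring points
    return np.argsort(-gamma)[:batch_size]
```

For two well-separated clusters where each contains one high-scoring point, the rule returns one point from each cluster rather than two near-duplicates from the single best region.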




            Published In

            Applied Intelligence  Volume 55, Issue 4
            Feb 2025
            1237 pages

            Publisher

            Kluwer Academic Publishers

            United States

            Publication History

            Published: 04 January 2025
            Accepted: 04 December 2024

            Author Tags

            1. Batch-mode active learning
            2. Ordinal classification
            3. Informative
            4. Leadership tree

            Qualifiers

            • Research-article

            Funding Sources

            • Guangxi Natural Science Foundation
            • Project of Young Academic Innovation Team of Xiangsi Lake in Guangxi Minzu University
            • School Introduces Talents to Start Scientific Research Projects
