
An Experimental Comparison of the Nearest-Neighbor and Nearest-Hyperrectangle Algorithms

Published: 01 April 1995

Abstract

Algorithms based on Nested Generalized Exemplar (NGE) theory (Salzberg, 1991) classify new data points by computing their distance to the nearest “generalized exemplar” (i.e., either a point or an axis-parallel rectangle). They combine the distance-based character of nearest neighbor (NN) classifiers with the axis-parallel rectangle representation employed in many rule-learning systems. An implementation of NGE was compared to the k-nearest neighbor (kNN) algorithm in 11 domains and found to be significantly inferior to kNN in 9 of them. Several modifications of NGE were studied to understand the cause of its poor performance. These show that its performance can be substantially improved by preventing NGE from creating overlapping rectangles, while still allowing complete nesting of rectangles. Performance can be further improved by modifying the distance metric to allow weights on each of the features (Salzberg, 1991). Best results were obtained in this study when the weights were computed using mutual information between the features and the output class. The best version of NGE developed is a batch algorithm (BNGE FWMI) that has no user-tunable parameters. BNGE FWMI's performance is comparable to the first-nearest neighbor algorithm (also incorporating feature weights). However, the k-nearest neighbor algorithm is still significantly superior to BNGE FWMI in 7 of the 11 domains, and inferior to it in only 2. We conclude that, even with our improvements, the NGE approach is very sensitive to the shape of the decision boundaries in classification problems. In domains where the decision boundaries are axis-parallel, the NGE approach can produce excellent generalization with interpretable hypotheses. In all domains tested, NGE algorithms require much less memory to store generalized exemplars than is required by NN algorithms.
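To make the abstract's core computation concrete, the following is a minimal sketch (in Python with NumPy, using made-up numbers; it is not the authors' implementation). It shows the weighted distance from a query point to a single generalized exemplar, i.e., an axis-parallel hyperrectangle, where a stored point is just a degenerate rectangle whose lower and upper bounds coincide. An NGE-style classifier would assign the query the class of the exemplar with the smallest such distance. The per-feature weights are assumed constants here; in the BNGE FWMI variant described above they would instead be derived from the mutual information between each (discretized) feature and the output class.

    import numpy as np

    def rectangle_distance(query, lower, upper, weights):
        """Weighted distance from a query point to the axis-parallel
        hyperrectangle [lower, upper]; zero along any feature whose
        value already falls inside the rectangle's interval."""
        query, lower, upper, weights = (
            np.asarray(a, dtype=float) for a in (query, lower, upper, weights)
        )
        # Per-feature gap: 0 if the query lies inside the interval,
        # otherwise the distance to the nearest face of the rectangle.
        gap = np.maximum(lower - query, 0.0) + np.maximum(query - upper, 0.0)
        return float(np.sqrt(np.sum((weights * gap) ** 2)))

    # Hypothetical two-feature exemplar and weights (illustrative values only).
    lower = np.array([0.2, 0.0])
    upper = np.array([0.6, 0.4])
    w = np.array([1.0, 2.0])   # e.g., mutual-information-derived feature weights

    print(rectangle_distance([0.3, 0.2], lower, upper, w))  # 0.0  (query lies inside the rectangle)
    print(rectangle_distance([0.8, 0.2], lower, upper, w))  # 0.2  (query is 0.2 outside along the first feature)

A point exemplar (lower == upper) reduces this to an ordinary weighted Euclidean distance, which is why the same metric serves both the nearest-neighbor and nearest-hyperrectangle sides of the comparison.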



Published In

Machine Learning, Volume 19, Issue 1
April 1995
85 pages
ISSN:0885-6125

Publisher

Kluwer Academic Publishers

United States

Author Tags

  1. exemplar-based learning
  2. feature weights
  3. instance-based learning
  4. nearest neighbors
  5. nested generalized exemplars

Qualifiers

  • Article


Cited By

  • (2018) A Hybrid Case Based Reasoning Model for Classification in Internet of Things (IoT) Environment. Journal of Organizational and End User Computing 30(4), 104-122. DOI: 10.4018/JOEUC.2018100107. Online publication date: 1-Oct-2018
  • (2017) Tri-partition neighborhood covering reduction for robust classification. International Journal of Approximate Reasoning 83(C), 371-384. DOI: 10.1016/j.ijar.2016.11.010. Online publication date: 1-Apr-2017
  • (2017) MoNGEL. Pattern Analysis & Applications 20(2), 441-452. DOI: 10.1007/s10044-015-0506-y. Online publication date: 1-May-2017
  • (2017) A novel machine learning method based on generalized behavioral learning theory. Neural Computing and Applications 28(12), 3921-3939. DOI: 10.1007/s00521-016-2314-8. Online publication date: 1-Dec-2017
  • (2017) Hybrid case-based reasoning system by cost-sensitive neural network for classification. Soft Computing - A Fusion of Foundations, Methodologies and Applications 21(24), 7579-7596. DOI: 10.1007/s00500-016-2312-x. Online publication date: 1-Dec-2017
  • (2016) A cost-sensitive classification algorithm. Knowledge-Based Systems 95(C), 99-113. DOI: 10.1016/j.knosys.2015.12.010. Online publication date: 1-Mar-2016
  • (2015) Case-based reasoning for predicting the success of therapy. Expert Systems: The Journal of Knowledge Engineering 32(2), 165-177. DOI: 10.1111/exsy.12074. Online publication date: 1-Apr-2015
  • (2015) IRAHC. Pattern Recognition 48(5), 1878-1889. DOI: 10.1016/j.patcog.2014.11.005. Online publication date: 1-May-2015
  • (2015) A joint generalized exemplar method for classification of massive datasets. Applied Soft Computing 36(C), 487-498. DOI: 10.1016/j.asoc.2015.07.044. Online publication date: 1-Nov-2015
  • (2014) An efficient construction and application usefulness of rectangle greedy covers. Pattern Recognition 47(3), 1459-1468. DOI: 10.1016/j.patcog.2013.09.008. Online publication date: 1-Mar-2014
  • Show More Cited By
