Abstract
A neural network classifier, called supervised extended ART (SEART), that incorporates a supervised mechanism into the extended unsupervised ART is presented here. It builds on a learning theory called Nested Generalized Exemplar (NGE) theory. At any time, a training instance may or may not carry a desired output; that is, the model handles supervised and unsupervised learning simultaneously. The unsupervised component finds the cluster relations among instances, and the supervised component learns the desired associations between clusters and classes. In addition, the model is capable of incremental learning, and it works equally well when instances within a cluster belong to different classes. Multi-category and nonconvex classifications can also be handled. The experimental results are encouraging.
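The abstract describes the architecture only at a high level. The following is a minimal, illustrative Python sketch of the general scheme it outlines: an unsupervised component that incrementally groups instances into clusters behind a vigilance-style match test, and a supervised component that accumulates cluster-to-class associations whenever a desired output is available. This is not the SEART algorithm from the paper; the class name, the similarity measure, the vigilance and learning-rate parameters, and the majority-vote prediction rule are all assumptions made for illustration.

```python
# Minimal, simplified sketch of the general idea in the abstract: an ART-like
# unsupervised component groups instances into clusters, while a supervised
# component accumulates cluster-to-class associations. NOT the SEART algorithm
# itself; the match rule and all names here are illustrative assumptions.

import numpy as np


class ClusterClassAssociator:
    """Toy combination of unsupervised clustering and supervised labeling."""

    def __init__(self, vigilance=0.8, learning_rate=0.5):
        self.vigilance = vigilance          # how similar an instance must be to join a cluster
        self.learning_rate = learning_rate  # how fast prototypes move toward new instances
        self.prototypes = []                # one prototype vector per cluster
        self.class_counts = []              # per-cluster dict: class label -> count

    def _similarity(self, x, proto):
        # Simple normalized similarity in [0, 1]; ART networks use a different match rule.
        return 1.0 - np.linalg.norm(x - proto) / (np.linalg.norm(x) + np.linalg.norm(proto) + 1e-12)

    def partial_fit(self, x, label=None):
        """Incrementally present one instance; `label` may be None (unsupervised)."""
        x = np.asarray(x, dtype=float)
        # Unsupervised component: find the best-matching existing cluster.
        best, best_sim = None, -1.0
        for j, proto in enumerate(self.prototypes):
            sim = self._similarity(x, proto)
            if sim > best_sim:
                best, best_sim = j, sim
        if best is None or best_sim < self.vigilance:
            # No cluster matches well enough: create a new one (incremental learning).
            self.prototypes.append(x.copy())
            self.class_counts.append({})
            best = len(self.prototypes) - 1
        else:
            # Move the winning prototype toward the instance.
            self.prototypes[best] += self.learning_rate * (x - self.prototypes[best])
        # Supervised component: record the desired association when a label is given.
        if label is not None:
            counts = self.class_counts[best]
            counts[label] = counts.get(label, 0) + 1
        return best

    def predict(self, x):
        """Classify by the majority label of the best-matching cluster."""
        if not self.prototypes:
            return None
        x = np.asarray(x, dtype=float)
        sims = [self._similarity(x, p) for p in self.prototypes]
        counts = self.class_counts[int(np.argmax(sims))]
        return max(counts, key=counts.get) if counts else None
```

Because each cluster keeps a histogram of labels rather than a single class, instances of different classes can share a cluster, loosely mirroring the property claimed in the abstract; presenting an instance without a label exercises only the unsupervised path.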
Cite this article
Lee, H.M., Lai, C.S. Supervised extended ART: A fast neural network classifier trained by combining supervised and unsupervised learning. Appl Intell 6, 117–128 (1996). https://doi.org/10.1007/BF00117812