Connectionist expert systems

Published: 01 February 1988 in Communications of the ACM, Volume 31, Issue 2

Abstract

Connectionist networks can be used as expert system knowledge bases. Furthermore, such networks can be constructed from training examples by machine learning techniques. This gives a way to automate the generation of expert systems for classification problems.
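To make the learning-from-examples pipeline concrete, here is a minimal sketch of the single-cell case: a linear threshold cell is trained from labeled examples with a pocket-style perceptron rule, and the learned weights are then queried like a tiny knowledge base. This is an illustrative assumption, not the system described in the article; the function names (pocket_train, classify), the training settings, and the example data are invented for demonstration.

# A minimal sketch (not the article's actual system): learn a linear
# threshold cell from classification examples, then use the learned
# weights as a tiny "knowledge base" that maps feature vectors to a class.
# The training data below is hypothetical.

import random

def pocket_train(examples, iterations=200, seed=0):
    """Pocket-style perceptron training: keep the weight vector that has
    produced the longest run of consecutive correct classifications."""
    rng = random.Random(seed)
    n = len(examples[0][0])
    w = [0.0] * (n + 1)                 # bias term followed by weights
    pocket, best_run, run = list(w), 0, 0
    for _ in range(iterations):
        x, y = rng.choice(examples)     # y is +1 or -1
        activation = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
        if (1 if activation > 0 else -1) == y:
            run += 1
            if run > best_run:          # current weights look better: pocket them
                pocket, best_run = list(w), run
        else:                           # perceptron update on a misclassification
            run = 0
            w[0] += y
            for i, xi in enumerate(x):
                w[i + 1] += y * xi
    return pocket

def classify(w, x):
    """Evaluate the learned cell on a new case."""
    return 1 if w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)) > 0 else -1

# Hypothetical training examples: binary feature vectors -> class label.
examples = [([1, 0, 1], 1), ([1, 1, 1], 1), ([0, 0, 1], -1), ([0, 1, 0], -1)]
weights = pocket_train(examples)
print(classify(weights, [1, 0, 0]))     # query the learned "knowledge base"

In the article's framing, the trained weights play the role of the knowledge base, and inference amounts to evaluating the learned cells on the observed features of a new case.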



Cited By

  • (2024) The application value of deep learning in the background of precision medicine in glioblastoma. Science Progress, 107:1. DOI: 10.1177/00368504231223353
  • (2024) Explainable Neural Networks: Achieving Interpretability in Neural Models. Archives of Computational Methods in Engineering, 31:6, 3535-3550. DOI: 10.1007/s11831-024-10089-4
  • (2024) Credit risk modelling within the euro area in the COVID-19 period: Evidence from an ICAS framework. International Journal of Finance & Economics. DOI: 10.1002/ijfe.2957
  • (2024) XAI is in trouble. AI Magazine, 45:3, 300-316. DOI: 10.1002/aaai.12184
  • (2023) Editing boolean classifiers. Proceedings of the Thirty-Seventh AAAI Conference on Artificial Intelligence and Thirty-Fifth Conference on Innovative Applications of Artificial Intelligence and Thirteenth Symposium on Educational Advances in Artificial Intelligence, 6516-6524. DOI: 10.1609/aaai.v37i5.25801
  • (2023) Knowledge Transfer-Based Sparse Deep Belief Network. IEEE Transactions on Cybernetics, 53:12, 7572-7583. DOI: 10.1109/TCYB.2022.3173632
  • (2023) Integrative Narratology and Its Application in AI-based Virtual Education System. 2023 7th International Conference on E-Society, E-Education and E-Technology (ESET), 8-17. DOI: 10.1109/ESET60968.2023.00009
  • (2022) From LSAT: The Progress and Challenges of Complex Reasoning. IEEE/ACM Transactions on Audio, Speech, and Language Processing, 30, 2201-2216. DOI: 10.1109/TASLP.2022.3164218
  • (2022) Hybrid Knowledge Extraction Framework Using Modified Adaptive Genetic Algorithm and BPNN. IEEE Access, 10, 72037-72050. DOI: 10.1109/ACCESS.2022.3188689
  • (2022) Rule extraction using ensemble of neural network ensembles. Cognitive Systems Research, 75, 36-52. DOI: 10.1016/j.cogsys.2022.07.004
