Abstract
Neuro-symbolic integration combines background knowledge with neural networks to obtain a more effective learning system, using the Core Method to encode rules. However, this method has several drawbacks when dealing with rules that have temporal extent. First, it requires an interface to the world that buffers the input patterns so they can be represented all at once. This imposes a rigid limit on the duration of patterns and further requires all input vectors to have the same length, which is troublesome in domains where one would like comparable representations for patterns of variable length (e.g. language). Second, it does not conveniently allow the dynamic insertion of rules. Finally, and most seriously, it cannot encode rules whose preconditions are satisfied at non-deterministic time points — an important class of rules. This paper presents novel methods for encoding such rules, thereby improving and extending the power of state-of-the-art neuro-symbolic integration.
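To make the key limitation concrete, the following is a minimal sketch (not the paper's actual encoding) of how a single hand-wired recurrent unit can represent a rule whose precondition may be satisfied at any time point: a self-recurrent connection latches the unit once the precondition symbol has appeared, and the input sequence may be of arbitrary length, with no fixed buffer. The function name `latch_rule`, the weights, and the threshold are illustrative assumptions.

```python
def step(x):
    """Heaviside-style threshold activation."""
    return 1.0 if x >= 0.5 else 0.0

def latch_rule(sequence):
    """Recurrent unit that fires once its precondition symbol 'a'
    has appeared at ANY earlier time step (a self-recurrent latch).
    Weights and threshold are hand-set for illustration only."""
    h = 0.0
    for sym in sequence:
        x = 1.0 if sym == 'a' else 0.0
        # input weight 1.0, recurrent weight 1.0, threshold 0.5:
        # once h reaches 1.0 it stays at 1.0 regardless of later input
        h = step(1.0 * x + 1.0 * h)
    return h

print(latch_rule(list("bba")))    # 1.0: 'a' occurred at the last step
print(latch_rule(list("abbb")))   # 1.0: 'a' occurred early, output stays latched
print(latch_rule(list("bbb")))    # 0.0: precondition never satisfied
```

A feed-forward network built by the standard Core Method would instead need the whole sequence buffered into one fixed-length input vector, which is exactly the restriction the abstract points out.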
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
The Anh, H., Marques, N.C. (2010). Towards Encoding Background Knowledge with Temporal Extent into Neural Networks. In: Bi, Y., Williams, M.-A. (eds.) Knowledge Science, Engineering and Management. KSEM 2010. Lecture Notes in Computer Science, vol. 6291. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15280-1_9
Print ISBN: 978-3-642-15279-5
Online ISBN: 978-3-642-15280-1