Abstract
Remote engineers can sketch the shape of an engineering component directly in the browser, and the recognition system then searches the company's component database over the Internet. In this paper, component patterns are stored in a database system; storing the patterns this way effectively increases the capacity of the recognition system. In our approach, the recognition system also adopts distributed computing, which raises its recognition rate. The system uses a recurrent neural network (RNN) with associative memory to perform training and recognition. In the final phase, a database-matching step is added to the recognition process alongside distributed computing, which resolves the problem of spurious states. The system has been deployed at the Yang-Fen Automation Electrical Engineering Company. The experiment ran for four months, and the company's engineers have become accustomed to using Web-based pattern recognition.
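The abstract describes the recognition core only as an associative-memory RNN followed by a database match that filters spurious states. The Python sketch below is a minimal illustration under that assumption: it uses a Hopfield-style Hebbian storage rule, synchronous recall, and an overlap-based match against the stored component patterns. The class name, pattern encoding, and matching rule are hypothetical and are not taken from the paper.

```python
import numpy as np

# Minimal sketch of an associative-memory recall step followed by a
# database match, assuming bipolar (+1/-1) pattern vectors; the paper's
# exact network, encoding, and matching rule are not specified here.

class HopfieldMemory:
    def __init__(self, patterns):
        # patterns: array of shape (p, n) with entries in {-1, +1}
        self.patterns = np.asarray(patterns, dtype=float)
        p, n = self.patterns.shape
        # Hebbian (outer-product) storage rule with zero diagonal
        self.W = self.patterns.T @ self.patterns / n
        np.fill_diagonal(self.W, 0.0)

    def recall(self, probe, steps=10):
        # Synchronous recall: iterate the sign update until it settles
        state = np.sign(np.asarray(probe, dtype=float))
        for _ in range(steps):
            new_state = np.sign(self.W @ state)
            new_state[new_state == 0] = 1.0
            if np.array_equal(new_state, state):
                break
            state = new_state
        return state

    def match_database(self, state):
        # Database match: map the (possibly spurious) attractor to the
        # closest stored component pattern by overlap, as a rejection filter.
        overlaps = self.patterns @ state / state.size
        best = int(np.argmax(overlaps))
        return best, float(overlaps[best])

# Hypothetical usage with two stored component patterns and a noisy sketch
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stored = rng.choice([-1.0, 1.0], size=(2, 64))
    memory = HopfieldMemory(stored)
    noisy = stored[0].copy()
    noisy[rng.choice(64, size=8, replace=False)] *= -1  # corrupt 8 entries
    attractor = memory.recall(noisy)
    idx, overlap = memory.match_database(attractor)
    print(f"matched stored pattern {idx} with overlap {overlap:.2f}")
```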
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Hsiao, SJ., Fan, KC., Sung, WT., Ou, SC. (2002). Using a Real-Time Web-Based Pattern Recognition System to Search for Component Patterns Database. In: Shafazand, H., Tjoa, A.M. (eds) EurAsia-ICT 2002: Information and Communication Technology. EurAsia-ICT 2002. Lecture Notes in Computer Science, vol 2510. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36087-5_86
DOI: https://doi.org/10.1007/3-540-36087-5_86
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-00028-0
Online ISBN: 978-3-540-36087-2