Abstract
The possibility of mapping and solving constrained combinatorial optimization problems on neural networks has frequently motivated proposals to use such models of computation.
We introduce a new stochastic neural model, designed for a specific class of constraints, that adaptively chooses its weights in order to find solutions within a proper subspace (the feasible region) of the search space.
We show its asymptotic convergence properties and give evidence of its ability to find high-quality solutions on benchmark and randomly generated instances of a specific problem.
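The model itself is not detailed in this preview. Purely as an illustration of the general approach the abstract describes (stochastic binary units whose constraint-penalty weight is adapted until the dynamics settles in the feasible region), here is a minimal sketch applied to maximum independent set; the update rule, the adaptation schedule, and all names are assumptions for illustration, not the authors' algorithm.

```python
import math
import random

def stochastic_penalty_mis(adj, n, steps=2000, seed=0):
    """Hypothetical sketch: discrete stochastic Hopfield-style dynamics for
    maximum independent set. A unit turns on with a probability that rewards
    joining the set and penalizes each selected neighbor; the penalty weight
    is strengthened whenever the current state violates a constraint."""
    rng = random.Random(seed)
    x = [0] * n          # binary state: x[i] = 1 means vertex i is selected
    penalty = 1.0        # adaptive constraint weight
    best = []            # best feasible (independent) set seen so far
    for _ in range(steps):
        i = rng.randrange(n)
        # net input: unit reward minus penalty per selected neighbor
        conflicts = sum(x[j] for j in adj[i])
        gain = max(-50.0, min(50.0, 1.0 - penalty * conflicts))
        # stochastic threshold unit (logistic acceptance probability)
        x[i] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-4.0 * gain)) else 0
        # adapt the weight: strengthen the penalty while an edge lies in the set
        if any(x[u] and x[v] for u in adj for v in adj[u] if u < v):
            penalty *= 1.05
        else:
            selected = [v for v in range(n) if x[v]]
            if len(selected) > len(best):
                best = selected
    return best

# Example: 5-cycle graph; a maximum independent set has size 2.
adj = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
sol = stochastic_penalty_mis(adj, 5)
```

The multiplicative penalty growth plays the role of the adaptive weight choice mentioned in the abstract: infeasible states become progressively less likely, so the dynamics concentrates on the feasible region.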
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Grossi, G. (2006). A Discrete Adaptive Stochastic Neural Model for Constrained Optimization. In: Kollias, S.D., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4131. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840817_67
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-38625-4
Online ISBN: 978-3-540-38627-8