Abstract
Facing ambiguities in regression problems is a challenge. Many powerful evolutionary schemes exist for regression; however, these techniques do not usually take ambiguities into account (i.e. the existence of two or more solutions for some or all points of the domain). Nonetheless, ambiguities are present in some real-world inverse problems, and in such cases it is interesting to provide the user with a choice of possible solutions. In this article we propose an approach based on boosted genetic programming that proposes several solutions when ambiguities are detected.
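The abstract does not detail the algorithm, but the core boosting mechanic it alludes to can be illustrated on a toy case. The sketch below is an assumption-laden illustration, not the paper's exact method: the data set contains two target branches for the same inputs (an ambiguity), each round fits a weak constant model under the current example weights, and examples fitted poorly are upweighted so that a later round settles on the other branch. The weak learner, the reweighting rule, and the `eps` smoothing constant are all hypothetical choices made for this sketch.

```python
# Toy ambiguous regression: the data holds two target branches,
# y = +1 (majority) and y = -1 (minority), for the same inputs.
# Illustrative sketch only -- NOT the paper's exact algorithm.

def weighted_median(values, weights):
    """Smallest value whose cumulative weight reaches half the total."""
    total = sum(weights)
    cum = 0.0
    for v, w in sorted(zip(values, weights)):
        cum += w
        if cum >= total / 2:
            return v

def boost(targets, rounds=2, eps=0.1):
    n = len(targets)
    weights = [1.0 / n] * n
    models = []
    for _ in range(rounds):
        pred = weighted_median(targets, weights)   # weak constant model
        errors = [abs(pred - y) for y in targets]
        # Upweight the poorly fitted examples so the next round
        # concentrates on the other solution branch.
        weights = [w * (eps + e) for w, e in zip(weights, errors)]
        s = sum(weights)
        weights = [w / s for w in weights]
        models.append(pred)
    return models

targets = [1.0] * 15 + [-1.0] * 5      # two solution branches
print(boost(targets))                   # → [1.0, -1.0]
```

The first round captures the majority branch; reweighting then shifts most of the mass onto the minority branch, which the second round recovers, so the set of boosted models offers the user both solutions. In the paper's setting the constant weak learner would be replaced by a GP-evolved regressor.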
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Paris, G., Robilliard, D., Fonlupt, C. (2003). Genetic Programming with Boosting for Ambiguities in Regression Problems. In: Ryan, C., Soule, T., Keijzer, M., Tsang, E., Poli, R., Costa, E. (eds) Genetic Programming. EuroGP 2003. Lecture Notes in Computer Science, vol 2610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36599-0_17
Print ISBN: 978-3-540-00971-9
Online ISBN: 978-3-540-36599-0