
Genetic Programming with Boosting for Ambiguities in Regression Problems

  • Conference paper
  • Conference: Genetic Programming (EuroGP 2003)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2610)

Abstract

Facing ambiguities in regression problems is a challenge. Many powerful evolutionary schemes exist to deal with regression, but these techniques do not usually take ambiguities into account (i.e. the existence of two or more solutions for some or all points in the domain). Nonetheless, ambiguities are present in some real-world inverse problems, and it is valuable in such cases to provide the user with a choice of possible solutions. In this article we propose an approach based on boosted genetic programming that offers several candidate solutions when ambiguities are detected.
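
As background for the abstract above, the sketch below illustrates the generic boosting-for-regression idea (a Drucker-style reweighting loop) that such an approach builds on. It is only a minimal illustration under stated assumptions: the weak learner fit_weak_regressor is a hypothetical stand-in for a genetic programming run, and the loop does not reproduce the authors' mechanism for proposing several solutions when ambiguities are detected.

    # Illustrative sketch only: a Drucker-style (AdaBoost.R2-like) boosting loop
    # for regression. fit_weak_regressor() is a hypothetical stand-in for one GP
    # run and is NOT the method of the paper; a real implementation would evolve
    # program trees on the reweighted training data.
    import numpy as np

    def fit_weak_regressor(X, y, w):
        """Stand-in weak learner: a weighted straight-line fit."""
        coeffs = np.polyfit(X, y, deg=1, w=w)
        return lambda x: np.polyval(coeffs, x)

    def boost_regression(X, y, n_rounds=5):
        """Return a list of (model, weight) pairs built by reweighted training."""
        n = len(X)
        w = np.full(n, 1.0 / n)                 # start from a uniform distribution
        ensemble = []
        for _ in range(n_rounds):
            model = fit_weak_regressor(X, y, w)
            err = np.abs(model(X) - y)
            loss = err / err.max() if err.max() > 0 else err   # linear loss in [0, 1]
            avg_loss = float(np.sum(w * loss))
            if avg_loss == 0.0:                 # perfect fit on the weighted sample
                ensemble.append((model, 1.0))
                break
            if avg_loss >= 0.5:                 # weak learner too weak: stop boosting
                break
            beta = avg_loss / (1.0 - avg_loss)
            ensemble.append((model, np.log(1.0 / beta)))
            w *= beta ** (1.0 - loss)           # shrink weights of well-fitted points
            w /= w.sum()                        # renormalise to a distribution
        return ensemble

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = np.linspace(0.0, 1.0, 50)
        y = 2.0 * X + rng.normal(scale=0.05, size=X.shape)
        models = boost_regression(X, y)
        print(f"kept {len(models)} boosted regressors")

In a boosted GP setting such as the one the abstract describes, each round would presumably evolve a program on the reweighted data, and the resulting ensemble members could be kept as alternative candidate models rather than being merged into a single prediction; the specific detection and presentation of ambiguous regions is part of the paper itself and is not shown here.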




Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Paris, G., Robilliard, D., Fonlupt, C. (2003). Genetic Programming with Boosting for Ambiguities in Regression Problems. In: Ryan, C., Soule, T., Keijzer, M., Tsang, E., Poli, R., Costa, E. (eds) Genetic Programming. EuroGP 2003. Lecture Notes in Computer Science, vol 2610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36599-0_17

  • DOI: https://doi.org/10.1007/3-540-36599-0_17

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-00971-9

  • Online ISBN: 978-3-540-36599-0

  • eBook Packages: Springer Book Archive
