DOI: 10.1109/ESEM.2011.33
Article

Handling Estimation Uncertainty with Bootstrapping: Empirical Evaluation in the Context of Hybrid Prediction Methods

Published: 22 September 2011

Abstract

Reliable predictions are essential for managing software projects with respect to cost and quality. Several studies have shown that hybrid prediction models combining causal models with Monte Carlo simulation are especially successful in addressing the needs and constraints of today's software industry: they deal with limited measurement data and, additionally, make use of expert knowledge. Moreover, instead of providing mere point estimates, they support the handling of estimation uncertainty, e.g., estimating the probability of falling below or exceeding a specific threshold. Although existing methods do well in terms of handling the uncertainty of information, we show that they leave the uncertainty coming from imperfect modeling largely unaddressed. One consequence is that they are likely to provide overconfident uncertainty estimates. This paper presents a possible solution by integrating bootstrapping into the existing methods. In order to evaluate whether this solution not only theoretically improves the estimates but also has a practical impact on the quality of the results, we evaluated it in an empirical study using data from more than sixty projects and six estimation models from different domains and application areas. The results indicate that the uncertainty estimates of currently used models are not realistic and can be significantly improved by the proposed solution.
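The abstract's core idea, propagating model-parameter uncertainty via bootstrapping in addition to the input uncertainty already covered by Monte Carlo simulation, can be illustrated with a minimal sketch. The code below is not the authors' CoBRA/HyDEEP implementation; the historical data, the single triangular cost driver, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical historical data (assumption): effort per size unit observed
# on past projects, used to fit the model's nominal productivity.
past_productivity = np.array([5.1, 4.8, 6.0, 5.5, 4.9, 5.7, 6.2, 5.3])
project_size = 120.0  # size of the project to be estimated (assumption)

def monte_carlo_effort(nominal_productivity, n_runs):
    """Hybrid-method Monte Carlo step: sample an expert-quantified cost
    overhead (here a single triangular driver, an assumption) and turn
    it into effort predictions."""
    overhead = rng.triangular(0.0, 0.15, 0.40, size=n_runs)
    return project_size * nominal_productivity * (1.0 + overhead)

# (a) Conventional hybrid prediction: the model parameter is a single
# point estimate, so only input uncertainty reaches the result.
plain = monte_carlo_effort(past_productivity.mean(), n_runs=100_000)

# (b) With bootstrapping: refit the parameter on resampled data before
# each batch of simulation runs, so the uncertainty of the imperfectly
# estimated model is also propagated into the predictive distribution.
batches = []
for _ in range(1_000):
    resample = rng.choice(past_productivity,
                          size=past_productivity.size, replace=True)
    batches.append(monte_carlo_effort(resample.mean(), n_runs=100))
boot = np.concatenate(batches)

for name, dist in [("plain Monte Carlo", plain),
                   ("bootstrap + Monte Carlo", boot)]:
    lo, hi = np.percentile(dist, [5, 95])
    print(f"{name}: 90% interval = [{lo:.0f}, {hi:.0f}] effort units")
```

In this toy setting the bootstrapped interval comes out wider, since it also reflects how uncertain the fitted parameter itself is; in the paper's terms, the plain variant is the one at risk of overconfidence.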

Cited By

  • (2023) CoBRA without experts. Journal of Software: Evolution and Process, 35(12). DOI: 10.1002/smr.2569. Online publication date: 25-Apr-2023.
  • (2019) Software Effort Interval Prediction via Bayesian Inference and Synthetic Bootstrap Resampling. ACM Transactions on Software Engineering and Methodology, 28(1), 1-46. DOI: 10.1145/3295700. Online publication date: 9-Jan-2019.
  • (2016) Evaluation of estimation models using the Minimum Interval of Equivalence. Applied Soft Computing, 49(C), 956-967. DOI: 10.1016/j.asoc.2016.03.026. Online publication date: 1-Dec-2016.
  • (2014) The potential benefit of relevance vector machine to software effort estimation. Proceedings of the 10th International Conference on Predictive Models in Software Engineering, 52-61. DOI: 10.1145/2639490.2639510. Online publication date: 17-Sep-2014.



    Published In

    ESEM '11: Proceedings of the 2011 International Symposium on Empirical Software Engineering and Measurement
    September 2011
    473 pages
ISBN: 978-0-7695-4604-9

    Publisher

    IEEE Computer Society

    United States


    Author Tags

    1. CoBRA
    2. HyDEEP
    3. Monte Carlo simulation
    4. defect prediction
    5. effort estimation
    6. empirical study

