Feb 18, 2013 · This short note investigates the extent to which published analyses based on the NASA defect datasets are meaningful and comparable.
This study assesses the quality of 13 datasets that have been used extensively in research on software effort estimation and proposes a template.
Feb 18, 2013 · In recent years, there has been much interest in using machine learners to classify software modules into defect-prone and not defect-prone ...
Results: We find important differences between the two versions of the datasets, implausible values in one dataset, and generally insufficient detail documented ...
Mar 31, 2018 · Data quality: Some comments on the NASA software defect datasets. M Shepperd, Q Song, Z Sun, C Mair. IEEE Transactions on Software Engineering ...
Conclusion: Even after systematic data cleaning of the NASA MDP datasets, we found new erroneous data. Data quality should always be explicitly considered by ...
(2013) Data Quality: Some Comments on the NASA Software Defect Datasets, IEEE Transactions on Software Engineering, 39. Tim Menzies and Justin S. Di Stefano ...
Jun 1, 2016 · Data quality: Some comments on the NASA software defect datasets. IEEE Transactions on Software Engineering, 39(9):1208–1215, Sept 2013.
This dissertation identifies many data quality and methodological issues in previous defect prediction studies, and leads to a new proposed methodology for ...