Research article · DOI: 10.1145/3422392.3422415 · SBES Conference Proceedings

Oracles of Bad Smells: a Systematic Literature Review

Published: 21 December 2020

Abstract

A bad smell is evidence of a design problem that may harm software maintenance. Several studies have defined approaches or tools to aid the identification of bad smells. The evaluation of these studies' results usually relies on bad smell oracles. An oracle is a set of bad smell instances found in a given software system; such data serves as a reference template, or benchmark, for evaluating bad smell detection proposals. The availability and quality of bad smell oracles are therefore crucial to assessing the quality of bad smell detection strategies. This study aims to compile the bad smell oracles proposed in the literature. To this end, we conducted a Systematic Literature Review (SLR) to identify bad smell oracles and their characteristics. The main result is a catalog of bad smell oracles that may be useful for research on bad smells, especially studies that propose detection tools or strategies.
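To make the evaluation role of an oracle concrete, it can be viewed as a set of (smell kind, code location) pairs against which a detector's reported smells are scored. The sketch below is illustrative only; the class and smell names are invented, not taken from the paper or any real oracle:

```python
# Hypothetical sketch: scoring a smell detector against an oracle.
# An oracle is modeled as a set of (smell, location) pairs; a detection
# counts as a true positive only if the oracle contains the same pair.
oracle = {("God Class", "OrderManager"), ("Long Method", "Invoice.total")}
detected = {("God Class", "OrderManager"), ("God Class", "Logger")}

tp = len(oracle & detected)                      # correct detections
precision = tp / len(detected) if detected else 0.0
recall = tp / len(oracle) if oracle else 0.0
print(f"precision={precision:.2f} recall={recall:.2f}")  # both 0.50 here
```

This is why oracle quality matters: a missing or spurious entry in the oracle directly distorts the precision and recall reported for every tool evaluated against it.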


Cited By

  • (2024) Application of Deep Learning for Code Smell Detection: Challenges and Opportunities. SN Computer Science 5:5. https://doi.org/10.1007/s42979-024-02956-5. Online publication date: 3-Jun-2024.
  • (2024) Multi-label learning for identifying co-occurring class code smells. Computing 106:8, 2585-2612. https://doi.org/10.1007/s00607-024-01294-x. Online publication date: 27-May-2024.
  • (2023) A Systematic Literature Review on the Code Smells Datasets and Validation Mechanisms. ACM Computing Surveys 55:13s, 1-48. https://doi.org/10.1145/3596908. Online publication date: 13-Jul-2023.


Published In

SBES '20: Proceedings of the XXXIV Brazilian Symposium on Software Engineering
October 2020
901 pages
ISBN: 9781450387538
DOI: 10.1145/3422392

In-Cooperation

  • SBC: Brazilian Computer Society

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. bad smell
  2. benchmark
  3. code smell
  4. design anomaly
  5. oracle
  6. systematic literature review

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

SBES '20

Acceptance Rates

Overall Acceptance Rate 147 of 427 submissions, 34%

