
Comparative analysis of the syntactic and semantic consistency of terms in software testing glossaries

  • Research
  • Published in: Software Quality Journal

Abstract

This paper addresses terminological consistency issues in three software testing glossaries used in academia and industry. The evaluation focuses mainly on consistency, a sub-characteristic of information quality that comprises syntactic and semantic consistency. To conduct this study systematically, we established a set of activities or steps: defining the evaluation goal and scope, selecting the glossaries, conceiving the terminological categories, classifying the glossary terms into categories, calculating syntactic and semantic similarities, analyzing consistency, and making recommendations. For the testing domain, eight terminological categories were conceived, and each term of each selected glossary was assigned to a category, considering the semantics intended by the authors of these standard documents. To count the occurrence frequency of a term across the glossaries, a tool was built that also takes the matching of synonyms into account. Then, a comparative analysis of syntactic and semantic consistency was carried out for all the terms ending in the word “testing,” which enabled us to make recommendations. This exploratory study identifies some inconsistencies that might deserve further attention and efforts to promote agreement and harmonization among the authors/editors of these glossaries, in order to provide their readers with the most consistent and easiest way to understand and learn software testing concepts.
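To make the counting and comparison steps more concrete, the sketch below illustrates in Python how occurrence frequency with synonym matching and a simple syntactic similarity between definitions could be computed. It is only a minimal illustration under stated assumptions: the glossary entries shown are invented placeholders, and difflib's SequenceMatcher ratio is used as a stand-in for whatever similarity measures the paper's own tool employs.

```python
from difflib import SequenceMatcher

# Hypothetical, abridged glossary data: glossary name -> {term: (definition, synonyms)}.
# The real entries of the analyzed glossaries are not reproduced here.
glossaries = {
    "Glossary A": {
        "performance testing": ("testing to determine the performance of a component or system",
                                {"efficiency testing"}),
        "load testing": ("testing that evaluates behavior under anticipated load", set()),
    },
    "Glossary B": {
        "performance testing": ("type of testing conducted to evaluate performance characteristics",
                                set()),
    },
}

def occurrence_frequency(term, glossaries):
    """Count in how many glossaries a term occurs, counting synonym matches as occurrences."""
    count = 0
    for entries in glossaries.values():
        for candidate, (_, synonyms) in entries.items():
            if term == candidate or term in synonyms:
                count += 1
                break  # count each glossary at most once
    return count

def syntactic_similarity(text_a, text_b):
    """A simple character-based similarity ratio in [0, 1] between two definitions."""
    return SequenceMatcher(None, text_a.lower(), text_b.lower()).ratio()

# Compare definitions of every term ending in "testing" that both glossaries define.
for term, (definition, _) in glossaries["Glossary A"].items():
    if term.endswith("testing") and term in glossaries["Glossary B"]:
        other_definition = glossaries["Glossary B"][term][0]
        print(term,
              "frequency:", occurrence_frequency(term, glossaries),
              "syntactic similarity:", round(syntactic_similarity(definition, other_definition), 2))
```

Semantic consistency, by contrast, is not captured by string similarity alone; in the study it is analyzed with respect to the semantics intended by the glossary authors via the terminological categories, which this sketch does not attempt to model.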


Data availability

All data generated or analyzed during this study are included in this published article and its supplementary information files linked at https://bit.ly/SQJ_Appendices.


Funding

This line of research is partially supported by the Engineering School at Universidad Nacional de La Pampa, Argentina, under project 09-F079.

Author information


Contributions

Luis Olsina contributed to the conception and design of the updated exploratory study. Data collection and analysis were performed by Pablo Becker and Luis Olsina. The first draft of the manuscript was written by Luis Olsina, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Luis Olsina.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Olsina, L., Lew, P. & Becker, P. Comparative analysis of the syntactic and semantic consistency of terms in software testing glossaries. Software Qual J 32, 27–52 (2024). https://doi.org/10.1007/s11219-023-09638-0

