
A Systematic Literature Review of Empiricism and Norms of Reporting in Computing Education Research Literature

Published: 18 October 2021

Abstract

Context. Computing Education Research (CER) is critical to help the computing education community and policy makers support the increasing population of students who need to learn computing skills for future careers. For a community to systematically advance knowledge about a topic, the members must be able to understand published work thoroughly enough to perform replications, conduct meta-analyses, and build theories. There is a need to understand whether published research allows the CER community to systematically advance knowledge and build theories.
Objectives. The goal of this study is to characterize the reporting of empiricism in Computing Education Research literature by identifying whether publications include content necessary for researchers to perform replications, meta-analyses, and theory building. We answer three research questions related to this goal: (RQ1) What percentage of papers in CER venues have some form of empirical evaluation? (RQ2) Of the papers that have empirical evaluation, what are the characteristics of the empirical evaluation? (RQ3) Of the papers that have empirical evaluation, do they follow norms (both for inclusion and for labeling of information needed for replication, meta-analysis, and, eventually, theory-building) for reporting empirical work?
Methods. We conducted a systematic literature review of the 2014 and 2015 proceedings or issues of five CER venues: Technical Symposium on Computer Science Education (SIGCSE TS), International Symposium on Computing Education Research (ICER), Conference on Innovation and Technology in Computer Science Education (ITiCSE), ACM Transactions on Computing Education (TOCE), and Computer Science Education (CSE). We developed the CER Empiricism Assessment Rubric and applied it to the 427 papers accepted and published at these venues over 2014 and 2015. Two reviewers evaluated each paper using the Base Rubric to characterize the paper. A single reviewer then applied the other rubrics, as appropriate for the paper type, to characterize the norms of reporting. Discrepancies and questions were resolved through discussion among multiple reviewers.
Results. We found that over 80% of papers accepted across all five venues had some form of empirical evaluation. Quantitative evaluation methods were the most frequently reported. Papers most frequently reported results on interventions around pedagogical techniques, curriculum, community, or tools. Papers were roughly evenly split on whether they compared an intervention against another dataset or baseline. Most papers reported related work, following the expectations for doing so in the SIGCSE and CER community. However, many papers lacked properly reported research objectives, goals, research questions, or hypotheses; descriptions of participants; study design; data collection; and threats to validity. These results align with prior surveys of the CER literature.
Conclusions. CER authors are contributing empirical results to the literature; however, not all norms for reporting are met. We encourage authors to provide clear, labeled details about their work so readers can use the study methodologies and results for replications and meta-analyses. As our community grows, our reporting of CER should mature to help establish computing education theory to support the next generation of computing learners.




      Published In

      ACM Transactions on Computing Education, Volume 22, Issue 1 (March 2022), 258 pages
      EISSN: 1946-6226
      DOI: 10.1145/3487993
      Editor: Amy J. Ko

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 18 October 2021
      Accepted: 01 June 2021
      Received: 01 September 2019
      Published in TOCE Volume 22, Issue 1


      Author Tags

      1. Systematic literature review
      2. empiricism
      3. computing education research

      Qualifiers

      • Research-article
      • Refereed

      Funding Sources

      • National Science Foundation


      Cited By

      • (2024) Teaching Ethics in Computing: A Systematic Literature Review of ACM Computer Science Education Publications. ACM Transactions on Computing Education 24, 1, 1–36. DOI: 10.1145/3634685. Online publication date: 14-Jan-2024.
      • (2024) Exploring a Justice-Oriented Approach to an App Development Club: A Middle School YPAR Project. Proceedings of the 2024 ACM Conference on International Computing Education Research - Volume 2, 574–577. DOI: 10.1145/3632621.3671420. Online publication date: 12-Aug-2024.
      • (2024) Using Benchmarking Infrastructure to Evaluate LLM Performance on CS Concept Inventories: Challenges, Opportunities, and Critiques. Proceedings of the 2024 ACM Conference on International Computing Education Research - Volume 1, 452–468. DOI: 10.1145/3632620.3671097. Online publication date: 12-Aug-2024.
      • (2023) Generative AI in Computing Education: Perspectives of Students and Instructors. 2023 IEEE Frontiers in Education Conference (FIE), 1–9. DOI: 10.1109/FIE58773.2023.10343467. Online publication date: 18-Oct-2023.
      • (2023) An Examination of Empirical Evidence Produced by a Decade of K-12 Computer Science Education Research. 2023 IEEE Frontiers in Education Conference (FIE), 1–9. DOI: 10.1109/FIE58773.2023.10342948. Online publication date: 18-Oct-2023.
      • (2023) A scoping review of research exploring teachers’ experiences with Digital Technologies curricula. Journal of Research on Technology in Education 56, 6, 733–751. DOI: 10.1080/15391523.2023.2211780. Online publication date: 11-May-2023.
      • (2023) The Evolution of Computing Education Research: A Meta-Analytic Perspective. Past, Present and Future of Computing Education Research, 51–77. DOI: 10.1007/978-3-031-25336-2_4. Online publication date: 5-Jan-2023.
      • (2022) Parsons Problems and Beyond. Proceedings of the 2022 Working Group Reports on Innovation and Technology in Computer Science Education, 191–234. DOI: 10.1145/3571785.3574127. Online publication date: 27-Dec-2022.
      • (2022) Launching Registered Report Replications in Computer Science Education Research. Proceedings of the 2022 ACM Conference on International Computing Education Research - Volume 1, 309–322. DOI: 10.1145/3501385.3543971. Online publication date: 3-Aug-2022.
      • (2022) Good Students are Good Students: Student Achievement with Visual versus Textual Programming. 2022 IEEE Frontiers in Education Conference (FIE), 1–9. DOI: 10.1109/FIE56618.2022.9962693. Online publication date: 8-Oct-2022.
