Assessment of Code, Which Aspects Do Teachers Consider and How Are They Valued?

Published: 15 September 2022

Abstract

In many countries, computer programming is becoming an integral part of the secondary school curriculum. However, many teachers, especially in the first years of Flemish secondary school, have limited experience with teaching programming. To improve their programming knowledge, many different types of professional development programs have been proposed. Nevertheless, these programs mostly focus on technical skills rather than pedagogical skills, and one aspect that is often overlooked is how teachers can assess code. To gain insight into what teachers currently value when assessing code, we designed an experiment that analyzes the different aspects teachers consider during the assessment of code. In the experiment, the teachers (N=13) assessed a set of programs from five different fictional learners. Afterwards, they participated in a semi-structured interview, giving us insight into their assessment process. We analyzed the interview transcripts with deductive thematic analysis, using a coding schema that defines the different aspects of code that can be assessed. Additionally, we linked the assessment strategies of teachers to their teaching experience. Our results indicate that many teachers are unaware of the different concepts that can be part of the assessment of code, which might lead to inaccurate or invalid feedback. Moreover, although our experimental group was too small to draw firm conclusions about the inter-case results, the number of concepts considered by teachers seems to increase with experience. These results provide an initial insight into the code assessment practices of teachers and reveal interesting pathways for future research into the assessment of code.
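The deductive analysis step described above can be illustrated with a minimal sketch: transcript segments are coded against a fixed schema, and the distinct aspects each teacher mentioned are tallied. The schema categories, teacher IDs, and data below are invented for illustration and are not taken from the article.

```python
from collections import defaultdict

# Hypothetical coding schema: aspects of code that can be assessed.
SCHEMA = {"correctness", "style", "algorithmic_thinking", "creativity", "debugging"}

# Invented example data: (teacher_id, code applied to a transcript segment).
coded_segments = [
    ("T01", "correctness"), ("T01", "style"),
    ("T02", "correctness"),
    ("T03", "correctness"), ("T03", "debugging"), ("T03", "creativity"),
]

def aspects_per_teacher(segments):
    """Count the distinct schema aspects each teacher mentioned."""
    seen = defaultdict(set)
    for teacher, code in segments:
        if code not in SCHEMA:
            raise ValueError(f"code {code!r} is not in the coding schema")
        seen[teacher].add(code)
    return {teacher: len(codes) for teacher, codes in seen.items()}

print(aspects_per_teacher(coded_segments))  # {'T01': 2, 'T02': 1, 'T03': 3}
```

Validating every applied code against the schema is what makes the analysis deductive: codes are drawn from a predefined frame rather than emerging from the data.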




    Published In

    ACM Transactions on Computing Education, Volume 22, Issue 4
    December 2022, 384 pages
    EISSN: 1946-6226
    DOI: 10.1145/3561990
    Editor: Amy J. Ko

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 15 September 2022
    Online AM: 24 March 2022
    Accepted: 05 February 2022
    Revised: 30 November 2021
    Received: 18 August 2021
    Published in TOCE Volume 22, Issue 4


    Author Tags

    1. K12
    2. programming
    3. assessment
    4. thematic analysis
    5. teachers

    Qualifiers

    • Research-article
    • Refereed

