DOI: 10.1145/3478432.3499047
Poster
Public Access

How Do You Know if They Don't Know?: The Design of Pre-Tests in Computing Education Research

Published: 03 March 2022

Abstract

As computing education expands to "all" students, so too must the assessment of computational learning. However, there are many challenges to designing and using computing assessments in a valid and reliable way. This is especially true with respect to pre-tests, or assessments given at the beginning of an intervention, course, or study. For elementary and middle school interventions, it is still likely that many students in any given study sample will have had no prior experience with computing. For high school interventions, students may have a wide range of prior experiences. How do you design or select a pre-test for these situations? In this poster, we discuss the design of pre-tests for the computing education research community. We outline the fundamental principles of pre-tests and the different purposes they serve in research studies. We complement these principles with examples of pre-tests used in current computing education research. This poster aims to provide guidance on how to intentionally develop and use pre-tests to strengthen the validity of our research findings and better inform our understanding of student learning outcomes.

Cited By

  • (2023) Improving Grading Outcomes in Software Engineering Projects through Automated Contributions Summaries. Proceedings of the 45th International Conference on Software Engineering: Software Engineering Education and Training, 259-270. DOI: 10.1109/ICSE-SEET58685.2023.00030. Online publication date: 17-May-2023.


    Published In

    SIGCSE 2022: Proceedings of the 53rd ACM Technical Symposium on Computer Science Education V. 2
    March 2022
    254 pages
    ISBN: 9781450390712
    DOI: 10.1145/3478432
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. assessment
    2. measures
    3. pre-test

    Qualifiers

    • Poster

    Conference

    SIGCSE 2022

    Acceptance Rates

    Overall Acceptance Rate 1,595 of 4,542 submissions, 35%
