DOI: 10.1145/3430895.3460153
Work in Progress

Data-informed Decision-making in TEFA Processes: An Empirical Study of a Process Derived from Peer-Instruction

Published: 08 June 2021

Abstract

When formative assessment involves a large number of learners, Technology-Enhanced Formative Assessment (TEFA) is one of the most popular solutions. However, current TEFA processes lack data-informed decision-making. By analyzing a dataset gathered from a formative assessment tool, we provide evidence about how to improve decision-making in processes that ask learners to answer the same question before and after a confrontation with peers. Our results suggest that learners' understanding increases when the proportion of correct answers before the confrontation is close to 50%, or when learners rate peers' rationales consistently. Furthermore, peer ratings are more consistent when learners' confidence degrees are consistent. These results led us to design a decision-making model whose benefits will be studied in future work.
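The first finding above (understanding gains are largest when the first-vote correctness is near 50%) can be read as a decision rule for a TEFA tool: only trigger the peer-confrontation phase when the first vote falls inside a window around 50%. A minimal sketch of such a rule follows; the function name and the 0.3–0.7 window are illustrative assumptions, not values taken from the paper:

```python
def should_trigger_peer_confrontation(first_votes, low=0.3, high=0.7):
    """Decide whether a peer-confrontation phase is worthwhile.

    first_votes: list of 1 (correct) / 0 (incorrect) first answers.
    Returns True when the proportion of correct first answers lies
    inside the [low, high] window around 50%, where peer discussion
    is most likely to improve understanding.
    """
    if not first_votes:
        raise ValueError("need at least one first-vote answer")
    proportion_correct = sum(first_votes) / len(first_votes)
    return low <= proportion_correct <= high


# A 50% split suggests discussion; a unanimous vote suggests moving on.
print(should_trigger_peer_confrontation([1, 0, 1, 0]))  # True
print(should_trigger_peer_confrontation([1, 1, 1, 1]))  # False
```

In practice the window boundaries would themselves be tuned from assessment data, which is exactly the kind of data-informed decision-making the paper argues current TEFA processes lack.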

Supplementary Material

MP4 File (Data-informed Decision Making.mp4)


    Published In

    L@S '21: Proceedings of the Eighth ACM Conference on Learning @ Scale
    June 2021
    380 pages
    ISBN:9781450382151
    DOI:10.1145/3430895
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. decision-making
    2. learning analytics
    3. peer instruction
    4. technology-enhanced formative assessment

    Qualifiers

    • Work in progress

    Conference

    L@S '21
    L@S '21: Eighth (2021) ACM Conference on Learning @ Scale
    June 22 - 25, 2021
    Virtual Event, Germany

    Acceptance Rates

    Overall Acceptance Rate 117 of 440 submissions, 27%

