
BERT-IRT: Accelerating Item Piloting with BERT Embeddings and Explainable IRT Models

Kevin P. Yancey, Andrew Runge, Geoffrey LaFlair, Phoebe Mulcaire


Abstract
Estimating item parameters (e.g., the difficulty of a question) is an important part of modern high-stakes tests. Conventional methods require lengthy pilots to collect response data from a representative population of test-takers. The need for these pilots limits item bank size and how often item banks can be refreshed, which impacts test security, while increasing the costs of supporting the test and taking up test-takers' valuable time. Our paper presents a novel explanatory item response theory (IRT) model, BERT-IRT, that has been used on the Duolingo English Test (DET), a high-stakes test of English, to reduce the length of pilots by a factor of 10. Our evaluation shows how the model uses BERT embeddings and engineered NLP features to accelerate item piloting without sacrificing criterion validity or reliability.
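To illustrate the general idea of an explanatory IRT model of this kind, the sketch below shows a minimal Rasch-style formulation in which each item's difficulty is predicted from its BERT embedding and engineered NLP features instead of being a free per-item parameter. The class name, feature dimensions, and training details are illustrative assumptions, not the paper's exact BERT-IRT parameterization.

# Hypothetical sketch (PyTorch): explanatory Rasch-style IRT where item
# difficulty b_i is a linear function of item features x_i (BERT embedding
# concatenated with engineered NLP features). Names/dimensions are assumed.
import torch
import torch.nn as nn

class ExplanatoryIRT(nn.Module):
    def __init__(self, n_test_takers: int, feature_dim: int):
        super().__init__()
        # Latent ability theta_j for each test-taker.
        self.ability = nn.Embedding(n_test_takers, 1)
        # Item difficulty b_i = w . x_i + c, predicted from item features.
        self.difficulty_head = nn.Linear(feature_dim, 1)

    def forward(self, taker_ids: torch.Tensor, item_features: torch.Tensor) -> torch.Tensor:
        theta = self.ability(taker_ids).squeeze(-1)          # shape: (batch,)
        b = self.difficulty_head(item_features).squeeze(-1)  # shape: (batch,)
        # Rasch model: P(correct response) = sigmoid(theta - b).
        return torch.sigmoid(theta - b)

# Toy usage: 100 test-takers, 768-dim BERT embedding + 8 engineered features per item.
model = ExplanatoryIRT(n_test_takers=100, feature_dim=768 + 8)
taker_ids = torch.randint(0, 100, (32,))
item_features = torch.randn(32, 768 + 8)
p_correct = model(taker_ids, item_features)
responses = torch.randint(0, 2, (32,)).float()
loss = nn.functional.binary_cross_entropy(p_correct, responses)

Because the difficulty head's weights are shared across items, a model of this form can score new, unpiloted items from their text features alone, which is what allows pilots to be shortened; the weights on interpretable engineered features also keep the model explainable.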
Anthology ID:
2024.bea-1.35
Volume:
Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Ekaterina Kochmar, Marie Bexte, Jill Burstein, Andrea Horbach, Ronja Laarmann-Quante, Anaïs Tack, Victoria Yaneva, Zheng Yuan
Venue:
BEA
SIG:
SIGEDU
Publisher:
Association for Computational Linguistics
Pages:
428–438
URL:
https://aclanthology.org/2024.bea-1.35
Cite (ACL):
Kevin P. Yancey, Andrew Runge, Geoffrey LaFlair, and Phoebe Mulcaire. 2024. BERT-IRT: Accelerating Item Piloting with BERT Embeddings and Explainable IRT Models. In Proceedings of the 19th Workshop on Innovative Use of NLP for Building Educational Applications (BEA 2024), pages 428–438, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
BERT-IRT: Accelerating Item Piloting with BERT Embeddings and Explainable IRT Models (Yancey et al., BEA 2024)
PDF:
https://aclanthology.org/2024.bea-1.35.pdf