Dialnet


Abstract of Using Item Response Theory To Assess Changes in Student Performance Based on Changes in Question Wording

Kimberly D. Schurmeier, Charles H. Atwood, Carrie G. Shepler, Gary J. Lautenschlager

  • Five years of longitudinal data from general chemistry student assessments at the University of Georgia have been analyzed using item response theory (IRT). Our analysis indicates that minor changes in question wording on exams can produce significant differences in student performance. The analysis encompasses data from over 6100 students, yielding very small statistical uncertainty. IRT provided new insight into student performance on our assessments that is also important to the chemical education community. In this paper, IRT, in conjunction with computerized testing, shows how nuances in question wording affect student performance on assessments.
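The abstract does not specify which IRT model the authors fit, but the effect it describes, a wording change shifting how hard an item is, can be illustrated with the standard two-parameter logistic (2PL) item characteristic curve. The parameter values below are hypothetical, chosen only to show how a small shift in the difficulty parameter changes the expected success rate:

```python
import math

def icc_2pl(theta, a, b):
    """2PL item characteristic curve: probability that a student of
    ability theta answers the item correctly, given discrimination a
    and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical parameters for the "same" item before and after a
# wording change: the rewording is modeled as a shift in difficulty b.
# For an average-ability student (theta = 0), even a modest shift in b
# visibly changes the predicted success rate.
p_original = icc_2pl(0.0, a=1.2, b=0.0)   # 0.5 by construction
p_reworded = icc_2pl(0.0, a=1.2, b=0.4)   # lower: item became harder
print(f"original wording: {p_original:.3f}, reworded: {p_reworded:.3f}")
```

In an actual IRT analysis, the two wordings would be calibrated as separate items against a common ability scale; comparing their estimated difficulty parameters is one way such wording effects are quantified.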

