A Regression Discontinuity Design Framework for Controlling Selection Bias in Evaluations of Differential Item Functioning. Academic Article

abstract

  • Differential item functioning (DIF) is often used to examine validity evidence for alternate-form test accommodations. Unfortunately, traditional approaches for evaluating DIF are prone to selection bias. This article proposes a novel DIF framework that capitalizes on regression discontinuity design analysis to control for selection bias. A simulation study was performed to compare the new framework with traditional logistic regression, with respect to the Type I error and power rates of the uniform DIF test statistics, and the bias and root mean square error of the corresponding effect size estimators. The new framework better controlled the Type I error rate and demonstrated minimal bias, but suffered from low power and a lack of precision. Implications for practice are discussed.
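  • For context, the sketch below (not taken from the article) illustrates the traditional logistic-regression uniform DIF test that serves as the comparison method: a model predicting item response from a matching score is compared with a model that adds a group indicator, via a likelihood ratio test. The simulated data, variable names, and parameter values are purely illustrative assumptions, and statsmodels is assumed to be available.

```python
# Hypothetical sketch of a uniform DIF test via logistic regression.
# Compare a matching-score-only model to one adding a group indicator;
# the likelihood ratio statistic is approximately chi-square with 1 df.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)             # 0 = reference, 1 = focal (illustrative)
theta = rng.normal(0, 1, n)               # latent ability
total = theta + rng.normal(0, 0.5, n)     # proxy: observed matching/total score
# Simulate one dichotomous item with uniform DIF (constant log-odds shift for focal group)
logit = 1.2 * theta - 0.3 - 0.4 * group
item = rng.binomial(1, 1 / (1 + np.exp(-logit)))

base = sm.Logit(item, sm.add_constant(np.column_stack([total]))).fit(disp=0)
full = sm.Logit(item, sm.add_constant(np.column_stack([total, group]))).fit(disp=0)

lr_stat = 2 * (full.llf - base.llf)       # uniform DIF test statistic
print(f"LR statistic = {lr_stat:.2f}, DIF effect (log-odds) = {full.params[-1]:.2f}")
```

    Because group membership here is assigned at random, this baseline approach has no selection bias; the article's concern arises when examinees are assigned to accommodations (groups) based on need, which the proposed regression discontinuity framework is designed to address.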

published proceedings

  • Educational and Psychological Measurement (Educ Psychol Meas)

altmetric score

  • 0.5

author list (cited authors)

  • Koziol, N. A., Goodrich, J. M., & Yoon, H.

citation count

  • 0

complete list of authors

  • Koziol, Natalie A.; Goodrich, J. Marc; Yoon, HyeonJin

publication date

  • December 2022