
Investigating Robustness of Item Response Theory Proficiency Estimators to Atypical Response Behaviors Under Two-Stage Multistage Testing

Author(s): Kim, Sooyeon; Moses, Tim P.
Publication Year: 2016
Report Number: ETS GRE-16-03
Source: ETS Research Report
Document Type: Report
Page Count: 25
Subject/Key Words: Item Response Theory (IRT), Proficiency Measurement, Multistage Testing (MST), Graduate Record Examination (GRE), Bayesian Analysis, Computerized Adaptive Testing (CAT), Error Detection

Abstract

The purpose of this study is to evaluate the extent to which item response theory (IRT) proficiency estimation methods are robust to the presence of aberrant responses under the GRE General Test multistage adaptive testing (MST) design. To that end, a wide range of atypical response behaviors affecting as many as 10% of the test items were simulated using a generic GRE 2-stage MST and the 2-parameter logistic (2PL) IRT model. As expected, some differences were found among the 5 estimators in terms of the recovery of true proficiency (theta); for example, the Bayesian estimators had lower error variance, and their estimates were regressed toward the mean. Once the IRT theta estimates were scaled onto a comparable reporting score scale, however, all the estimation methods investigated, including the one currently used to score GRE MSTs, were found to be equally robust under the simulated conditions.
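The sketch below illustrates, in simplified form, the kind of simulation the abstract describes: responses are generated from a 2PL model under a generic 2-stage MST, a fraction of items is made aberrant, and theta is recovered with two common estimators (maximum likelihood and Bayesian EAP). It is not the report's study code; the item-parameter ranges, module lengths, number-correct routing rule, the random-responding aberrance mechanism, and the choice of only two of the five estimators are illustrative assumptions.

```python
# Minimal sketch of a 2-stage MST simulation with aberrant responses under the 2PL model.
# All parameter ranges, routing rules, and the aberrance mechanism are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def p_2pl(theta, a, b):
    """2PL probability of a correct response: P = 1 / (1 + exp(-a(theta - b)))."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def simulate_module(theta, a, b, aberrant_rate=0.0):
    """Simulate dichotomous responses; aberrant items are answered at random (p = .5)."""
    p = p_2pl(theta, a, b)
    aberrant = rng.random(a.size) < aberrant_rate
    p = np.where(aberrant, 0.5, p)
    return (rng.random(a.size) < p).astype(int)

def mle_theta(x, a, b, grid=np.linspace(-4, 4, 161)):
    """Maximum-likelihood theta estimate via a simple grid search."""
    p = p_2pl(grid[:, None], a, b)
    loglik = (x * np.log(p) + (1 - x) * np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(loglik)]

def eap_theta(x, a, b, grid=np.linspace(-4, 4, 161)):
    """Bayesian EAP estimate with a standard normal prior (shrinks estimates toward the mean)."""
    p = p_2pl(grid[:, None], a, b)
    lik = np.exp((x * np.log(p) + (1 - x) * np.log(1 - p)).sum(axis=1))
    post = lik * np.exp(-0.5 * grid**2)
    return np.sum(grid * post) / np.sum(post)

def administer_mst(theta, aberrant_rate):
    """Generic 2-stage MST: a routing module, then an easier or harder second-stage module."""
    a_r, b_r = rng.uniform(0.8, 1.6, 20), rng.uniform(-1, 1, 20)      # routing module
    x_r = simulate_module(theta, a_r, b_r, aberrant_rate)
    if x_r.mean() < 0.5:                                              # number-correct routing
        a_2, b_2 = rng.uniform(0.8, 1.6, 20), rng.uniform(-2, 0, 20)  # easier second stage
    else:
        a_2, b_2 = rng.uniform(0.8, 1.6, 20), rng.uniform(0, 2, 20)   # harder second stage
    x = np.concatenate([x_r, simulate_module(theta, a_2, b_2, aberrant_rate)])
    return x, np.concatenate([a_r, a_2]), np.concatenate([b_r, b_2])

true_theta = 1.0
x, a, b = administer_mst(true_theta, aberrant_rate=0.10)  # up to 10% of items aberrant
print("MLE:", round(mle_theta(x, a, b), 3), " EAP:", round(eap_theta(x, a, b), 3))
```

Repeating this over many simulees and aberrance conditions, then comparing estimates with the true theta values, reproduces the general pattern the abstract reports: the Bayesian estimate shows less error variance but is pulled toward the prior mean relative to the maximum-likelihood estimate.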
