Influence of Selected-Response Format Variants on Test Characteristics and Test-Taking Effort: An Empirical Study

Author(s):
Guo, Hongwen; Rios, Joseph A.; Ling, Guangming; Wang, Zhen; Gu, Lin; Yang, Zhitong; Liu, Ou Lydia
Publication Year:
2022
Report Number:
RR-22-01
Source:
ETS Research Report
Document Type:
Report
Page Count:
22
Subject/Key Words:
Selected Response Items, Test Characteristics, Low-Stakes Assessment, Test Engagement, Multiple Choice Items, Programme for International Assessment of Adult Competencies (PIAAC), Technology Enhanced Assessments, Flagging, Rapid Responding, Guessing, Response Time Effort (RTE), Total Score

Abstract

Different variants of the selected-response (SR) item type have been developed for various reasons (e.g., simulating realistic situations, examining critical-thinking and/or problem-solving skills). These SR format variants are generally more complex than traditional multiple-choice (MC) items, which may make them more challenging for test takers and thus may discourage test engagement on low-stakes assessments. Low test-taking effort has been shown to distort test scores and thereby diminish score validity. We used data collected from a large-scale assessment to investigate how variants of the SR item format may affect test properties and test engagement. Results show that the studied SR format variants were generally harder and more time consuming than the traditional MC format but did not show a negative impact on test-taking effort. However, item position had a dominant, cumulative influence on nonresponse rates and rapid-guessing rates, although the effect sizes in the studied data were relatively small.
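The abstract refers to test-taking effort measured via Response Time Effort (RTE) and rapid guessing. As a minimal sketch (not the authors' actual analysis code), RTE can be computed as the proportion of items on which a test taker exhibits solution behavior, i.e., spends at least an item-specific rapid-guessing threshold; the thresholds and times below are purely illustrative assumptions.

```python
# Illustrative sketch of a Response Time Effort (RTE) index:
# the proportion of items answered with "solution behavior"
# (response time at or above the item's rapid-guessing threshold).
# Times and thresholds here are made-up examples, not study data.

def rte(response_times, thresholds):
    """Return RTE in [0, 1]; 1.0 means no rapid guessing detected."""
    solution_behaviors = sum(
        t >= th for t, th in zip(response_times, thresholds)
    )
    return solution_behaviors / len(response_times)

times = [12.4, 1.1, 30.0, 0.8, 15.2]   # seconds spent per item
thresholds = [3.0] * 5                  # illustrative 3-second threshold
print(rte(times, thresholds))           # → 0.6
```

With two of five responses falling below the threshold, the index flags 40% of responses as rapid guesses, which is the kind of engagement signal the study relates to item format and item position.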