
Using Existing Data to Inform Development of New Item Types

Author(s):
Guo, Hongwen; Ling, Guangming; Frankel, Lois
Publication Year:
2020
Report Number:
RR-20-01
Source:
ETS Research Report
Document Type:
Report
Page Count:
16
Subject/Key Words:
Test Reliability, Partial Credit Model, Multiple Choice Items, Data Analysis, Item Development, Item Types, Problem Solving, Critical Thinking

Abstract

With advances in technology, researchers and test developers are creating new item types to measure complex skills such as problem solving and critical thinking. Analyzing such items is often challenging because of their complicated response patterns, so it is important to develop psychometric methods that practitioners and researchers can use to analyze these new item types. In this study, we describe a generic approach that combines data-driven analyses with expert feedback from different research areas, so that the results can inform test developers and researchers about how complex item types contribute to score reliability and validity and how to make the test more efficient and reliable in measuring complex skills. A real-data example illustrates how to identify nonfunctioning options that might be removed from the test and how to decide whether partial credit for certain response selections should be considered.
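The kind of option-level screening the abstract mentions can be sketched with a classical distractor analysis: for each multiple-choice distractor, compute its selection proportion and its point-biserial correlation with total score, then flag options that are rarely chosen or that discriminate in the wrong direction. This is a minimal illustrative sketch, not the authors' method; the function name, the flagging thresholds, and the number-correct scoring are all assumptions made for the example.

```python
import numpy as np

def option_analysis(responses, keys, min_prop=0.05):
    """Flag potentially nonfunctioning multiple-choice options.

    responses: (n_examinees, n_items) array of selected option indices
    keys:      length-n_items array of keyed (correct) option indices
    min_prop:  minimum selection proportion for a distractor to be
               considered functioning (illustrative threshold)

    Returns a dict mapping (item, option) -> {"prop", "r"} for flagged
    distractors: those chosen too rarely, or chosen disproportionately
    by high scorers (positive point-biserial).
    """
    responses = np.asarray(responses)
    keys = np.asarray(keys)
    # Simple number-correct total score as the criterion
    scores = (responses == keys).sum(axis=1)
    flagged = {}
    for j in range(responses.shape[1]):
        for opt in np.unique(responses[:, j]):
            if opt == keys[j]:
                continue  # skip the keyed option; only screen distractors
            chose = (responses[:, j] == opt).astype(float)
            p = chose.mean()
            # Point-biserial: correlation of choosing this distractor
            # with the total score (undefined if everyone/no one chose it)
            r = float(np.corrcoef(chose, scores)[0, 1]) if 0 < p < 1 else 0.0
            if p < min_prop or r > 0:
                flagged[(j, int(opt))] = {"prop": round(p, 3), "r": round(r, 3)}
    return flagged
```

A distractor flagged with a positive point-biserial attracts stronger examinees, which in practice prompts a content review: the option may be ambiguous, partially correct (a candidate for partial credit), or miskeyed.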
