Item Response Theory Modeling for Examinee-selected Items with Rater Effect

Chen Wei Liu*, Xue Lan Qiu, Wen Chung Wang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Some large-scale tests require examinees to select and answer a fixed number of items from a given set (e.g., one out of three). These are usually constructed-response items marked by human raters. In this examinee-selected item (ESI) design, some examinees may benefit more than others by choosing easier items to answer, so the missing data induced by the design are missing not at random (MNAR). Although item response theory (IRT) models have recently been developed to account for MNAR data in the ESI design, they do not consider the rater effect, which seriously restricts their utility. In this study, two methods are developed: the first is a new IRT model that accounts for both MNAR data and rater severity simultaneously, and the second adapts conditional maximum likelihood and pairwise estimation methods to the ESI design with a rater effect. A series of simulations was conducted to compare their performance with that of conventional IRT models that ignore MNAR data or rater severity. The results indicated good parameter recovery for the new model. The conditional maximum likelihood and pairwise estimation methods were applicable when the Rasch models fit the data, whereas the conventional IRT models yielded biased parameter estimates. An empirical example illustrates these new initiatives.
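To make the setup concrete, the sketch below simulates data of the kind the abstract describes. It is an illustration only, not the authors' model or estimators: it uses a standard many-facet Rasch probability (logit = ability − item difficulty − rater severity) and a deliberately simple selection rule (examinees always choose the easier item in an offered pair), which is one way the ESI design can induce MNAR missingness. All function and variable names are hypothetical.

```python
import math
import random

def facets_rasch_prob(theta, b, severity):
    """Probability of a positive score under a many-facet Rasch model:
    logit = examinee ability - item difficulty - rater severity."""
    logit = theta - b - severity
    return 1.0 / (1.0 + math.exp(-logit))

def simulate_esi(thetas, item_pairs, severities, rng):
    """Toy ESI simulation (hypothetical): each examinee is offered a pair
    of items and picks the easier one, so the unselected item's response
    is missing not at random; the answered item is scored by a randomly
    assigned rater with his or her own severity."""
    data = []
    for theta in thetas:
        b1, b2 = rng.choice(item_pairs)   # offered pair of difficulties
        chosen = min(b1, b2)              # selection rule: take the easier item
        sev = rng.choice(severities)      # rater assigned at random
        p = facets_rasch_prob(theta, chosen, sev)
        data.append((theta, chosen, sev, int(rng.random() < p)))
    return data

rng = random.Random(1)
rows = simulate_esi([rng.gauss(0, 1) for _ in range(100)],
                    [(-0.5, 0.8), (0.2, 1.1)], [-0.3, 0.0, 0.4], rng)
```

Because the selection rule depends on item difficulty, ignoring it (or ignoring the severity term) when fitting a conventional IRT model to such data is exactly the source of the bias the abstract reports.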

Original language: English
Pages (from-to): 435-448
Number of pages: 14
Journal: Applied Psychological Measurement
Volume: 43
Issue number: 6
DOIs
Publication status: Published - 2019 Sep 1
Externally published: Yes

Keywords

  • examinee-selected items
  • missing not at random
  • rater severity

ASJC Scopus subject areas

  • Social Sciences (miscellaneous)
  • Psychology (miscellaneous)

