Unifying Differential Item Functioning in Factor Analysis for Categorical Data Under a Discretization of a Normal Variant

Yu Wei Chang, Nan Jung Hsu, Rung-Ching Tsai

Research output: Contribution to journal › Article

Abstract

The multiple-group categorical factor analysis (FA) model and the graded response model (GRM) are commonly used to examine polytomous items for differential item functioning (DIF) in order to detect possible measurement bias in educational testing. In this study, the multiple-group categorical FA model (MC-FA) and the multiple-group normal-ogive GRM are unified under the common framework of discretization of a normal variant. We rigorously justify a set of identified parameters and determine possible sets of identifiability constraints that make the parameters just-identified and estimable within the common MC-FA framework. Under this framework, the difference between the categorical FA model and the normal-ogive GRM is simply the use of two different sets of identifiability constraints, rather than a substantive distinction between categorical FA and GRM. We therefore compare the DIF-assessment performance of the categorical FA and GRM approaches through simulation studies of MC-FA models under their respective sets of identifiability constraints. Our results show that, under scenarios with varying degrees of DIF across examinees of different ability levels, models with the GRM-type identifiability constraints generally perform better in DIF detection, with higher testing power. General guidelines regarding the choice of just-identified parameterization are also provided for practical use.
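To make the unifying framework concrete, the following is a minimal sketch of a discretized normal latent response and how it yields both parameterizations. The notation (loadings λ_j, unique variances ψ_j, thresholds τ_{j,k}, discriminations a_j, difficulties b_{j,k}) is introduced here only for illustration and need not match the article's; group-specific means and intercepts are suppressed.

\[
y_{ij}^{*} = \lambda_j \theta_i + \varepsilon_{ij}, \qquad
\varepsilon_{ij} \sim N(0, \psi_j), \qquad
y_{ij} = k \;\iff\; \tau_{j,k-1} < y_{ij}^{*} \le \tau_{j,k},
\]
\[
\Pr(y_{ij} \ge k \mid \theta_i)
= \Phi\!\left(\frac{\lambda_j \theta_i - \tau_{j,k-1}}{\sqrt{\psi_j}}\right)
= \Phi\!\bigl(a_j (\theta_i - b_{j,k-1})\bigr), \qquad
a_j = \frac{\lambda_j}{\sqrt{\psi_j}}, \quad
b_{j,k-1} = \frac{\tau_{j,k-1}}{\lambda_j}.
\]

Roughly speaking, the categorical-FA convention fixes the scale by standardizing the latent responses (constraining the total variance of each y*_{ij}), whereas the normal-ogive GRM convention fixes the unique variances ψ_j; either choice just-identifies the same discretized-normal likelihood, which is why the two approaches can be compared purely through their constraint sets.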

Original language: English
Pages (from-to): 382-406
Number of pages: 25
Journal: Psychometrika
Volume: 82
Issue number: 2
DOI: 10.1007/s11336-017-9562-0
Publication status: Published - 2017 Jun 1

Keywords

  • differential item functioning
  • discretization of a normal variant
  • graded response models
  • identifiability

ASJC Scopus subject areas

  • Psychology (all)
  • Applied Mathematics

Cite this

Chang, Yu Wei; Hsu, Nan Jung; Tsai, Rung-Ching. Unifying Differential Item Functioning in Factor Analysis for Categorical Data Under a Discretization of a Normal Variant. In: Psychometrika, Vol. 82, No. 2, 01.06.2017, p. 382-406. DOI: 10.1007/s11336-017-9562-0

@article{bbf4cf60989043eb99a61caf9c803be5,
title = "Unifying Differential Item Functioning in Factor Analysis for Categorical Data Under a Discretization of a Normal Variant",
author = "Chang, {Yu Wei} and Hsu, {Nan Jung} and Tsai, {Rung-Ching}",
year = "2017",
month = jun,
day = "1",
doi = "10.1007/s11336-017-9562-0",
language = "English",
volume = "82",
pages = "382--406",
journal = "Psychometrika",
issn = "0033-3123",
publisher = "Springer New York",
number = "2",

}
