Improving Creativity Performance Assessment: A Rater Effect Examination with Many Facet Rasch Model

Research output: Contribution to journal › Article

11 Citations (Scopus)

Abstract

Product assessment is widely applied in creative studies, typically as an important dependent measure. Within this context, this study had 2 purposes. First, the focus of this research was on methods for investigating possible rater effects, an issue that has not received a great deal of attention in past creativity studies. Second, the substantive question of whether restrictions on materials used and differences in instructions provided would influence outcomes on measures of creativity was considered. The many-facet Rasch model was used to investigate possible sources of rater bias, including the leniency/severity effect, central tendency effect, halo effect and randomness effect. No indications were found that these potential sources of bias strongly influenced the ratings. The result indicated that the examinees could be reliably differentiated in terms of their performance. Analysis of rater-criterion interactions depicted rater behavior more clearly and, it is suggested, can be of use as a tool for rater training in future studies. In terms of the substantive questions posed, 2 × 2 experimental instructions were manipulated and it was found that different instructions did not affect creative performance. The implications of these findings are discussed.
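As a sketch not drawn from the article itself, the many-facet Rasch model referenced in the abstract is commonly written (following Linacre's formulation) as a log-odds model with separate facets for examinee, criterion, and rater; the symbols below are illustrative:

```latex
\log\!\left(\frac{P_{nijk}}{P_{nij(k-1)}}\right) = B_n - D_i - C_j - F_k
```

where \(B_n\) is the ability of examinee \(n\), \(D_i\) the difficulty of criterion \(i\), \(C_j\) the severity of rater \(j\), and \(F_k\) the threshold between rating categories \(k-1\) and \(k\). Under this parameterization, rater effects such as leniency/severity appear as shifts in \(C_j\), while central tendency and randomness effects surface in threshold and fit statistics.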

Original language: English
Pages (from-to): 345-357
Number of pages: 13
Journal: Creativity Research Journal
Volume: 24
Issue number: 4
DOI: 10.1080/10400419.2012.730331
Publication status: Published - 2012 Oct 1

ASJC Scopus subject areas

  • Visual Arts and Performing Arts
  • Developmental and Educational Psychology
  • Psychology (miscellaneous)

Cite this

Improving Creativity Performance Assessment: A Rater Effect Examination with Many Facet Rasch Model. / Hung, Su Pin; Chen, Po Hsi; Chen, Hsueh Chih.

In: Creativity Research Journal, Vol. 24, No. 4, 01.10.2012, p. 345-357.


@article{cdfa89ef43ca48f48808416fd236f71f,
title = "Improving Creativity Performance Assessment: A Rater Effect Examination with Many Facet Rasch Model",
abstract = "Product assessment is widely applied in creative studies, typically as an important dependent measure. Within this context, this study had 2 purposes. First, the focus of this research was on methods for investigating possible rater effects, an issue that has not received a great deal of attention in past creativity studies. Second, the substantive question of whether restrictions on materials used and differences in instructions provided would influence outcomes on measures of creativity was considered. The many-facet Rasch model was used to investigate possible sources of rater bias, including the leniency/severity effect, central tendency effect, halo effect and randomness effect. No indications were found that these potential sources of bias strongly influenced the ratings. The result indicated that the examinees could be reliably differentiated in terms of their performance. Analysis of rater-criterion interactions depicted rater behavior more clearly and, it is suggested, can be of use as a tool for rater training in future studies. In terms of the substantive questions posed, 2 × 2 experimental instructions were manipulated and it was found that different instructions did not affect creative performance. The implications of these findings are discussed.",
author = "Hung, {Su Pin} and Chen, {Po Hsi} and Chen, {Hsueh Chih}",
year = "2012",
month = oct,
day = "1",
doi = "10.1080/10400419.2012.730331",
language = "English",
volume = "24",
pages = "345--357",
journal = "Creativity Research Journal",
issn = "1040-0419",
publisher = "Routledge",
number = "4",

}

TY - JOUR

T1 - Improving Creativity Performance Assessment

T2 - A Rater Effect Examination with Many Facet Rasch Model

AU - Hung, Su Pin

AU - Chen, Po Hsi

AU - Chen, Hsueh Chih

PY - 2012/10/1

Y1 - 2012/10/1

AB - Product assessment is widely applied in creative studies, typically as an important dependent measure. Within this context, this study had 2 purposes. First, the focus of this research was on methods for investigating possible rater effects, an issue that has not received a great deal of attention in past creativity studies. Second, the substantive question of whether restrictions on materials used and differences in instructions provided would influence outcomes on measures of creativity was considered. The many-facet Rasch model was used to investigate possible sources of rater bias, including the leniency/severity effect, central tendency effect, halo effect and randomness effect. No indications were found that these potential sources of bias strongly influenced the ratings. The result indicated that the examinees could be reliably differentiated in terms of their performance. Analysis of rater-criterion interactions depicted rater behavior more clearly and, it is suggested, can be of use as a tool for rater training in future studies. In terms of the substantive questions posed, 2 × 2 experimental instructions were manipulated and it was found that different instructions did not affect creative performance. The implications of these findings are discussed.

UR - http://www.scopus.com/inward/record.url?scp=84869836517&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84869836517&partnerID=8YFLogxK

U2 - 10.1080/10400419.2012.730331

DO - 10.1080/10400419.2012.730331

M3 - Article

AN - SCOPUS:84869836517

VL - 24

SP - 345

EP - 357

JO - Creativity Research Journal

JF - Creativity Research Journal

SN - 1040-0419

IS - 4

ER -