Improving Creativity Performance Assessment: A Rater Effect Examination with Many Facet Rasch Model

Su Pin Hung*, Po Hsi Chen, Hsueh Chih Chen

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

22 Citations (Scopus)


Product assessment is widely applied in creativity studies, typically as an important dependent measure. Within this context, this study had 2 purposes. First, the research focused on methods for investigating possible rater effects, an issue that has not received a great deal of attention in past creativity studies. Second, the substantive question of whether restrictions on the materials used and differences in the instructions provided would influence outcomes on measures of creativity was considered. The many-facet Rasch model was used to investigate possible sources of rater bias, including the leniency/severity effect, central tendency effect, halo effect, and randomness effect. No indications were found that these potential sources of bias strongly influenced the ratings. The results indicated that the examinees could be reliably differentiated in terms of their performance. Analysis of rater-criterion interactions depicted rater behavior more clearly and, it is suggested, can be of use as a tool for rater training in future studies. In terms of the substantive questions posed, instructions were manipulated in a 2 × 2 experimental design, and it was found that different instructions did not affect creative performance. The implications of these findings are discussed.

Original language: English
Pages (from-to): 345-357
Number of pages: 13
Journal: Creativity Research Journal
Issue number: 4
Publication status: Published - 2012 Oct

ASJC Scopus subject areas

  • Visual Arts and Performing Arts
  • Developmental and Educational Psychology
  • Psychology (miscellaneous)
