TY - JOUR
T1 - Improving Creativity Performance Assessment
T2 - A Rater Effect Examination with Many Facet Rasch Model
AU - Hung, Su Pin
AU - Chen, Po Hsi
AU - Chen, Hsueh Chih
N1 - Funding Information:
The first version of this article was accepted as a poster session paper at the APA annual meeting. Preparation of this article was supported by Grant Award #100-2511-S-003-062-MY2 from the National Science Council as well as by the "Aim for the Top University Project" of the National Taiwan Normal University and the Ministry of Education, Taiwan, R.O.C. We thank Joseph Lavallee and Scott Sommers for their useful comments on an earlier version of this article.
PY - 2012/10
Y1 - 2012/10
N2 - Product assessment is widely applied in creativity studies, typically as an important dependent measure. Within this context, this study had 2 purposes. First, the focus of this research was on methods for investigating possible rater effects, an issue that has not received a great deal of attention in past creativity studies. Second, the substantive question of whether restrictions on the materials used and differences in the instructions provided would influence outcomes on measures of creativity was considered. The many-facet Rasch model was used to investigate possible sources of rater bias, including the leniency/severity effect, central tendency effect, halo effect, and randomness effect. No indications were found that these potential sources of bias strongly influenced the ratings. The results indicated that the examinees could be reliably differentiated in terms of their performance. Analysis of rater-criterion interactions depicted rater behavior more clearly and, it is suggested, can be of use as a tool for rater training in future studies. In terms of the substantive questions posed, 2 × 2 experimental instructions were manipulated and it was found that different instructions did not affect creative performance. The implications of these findings are discussed.
UR - http://www.scopus.com/inward/record.url?scp=84869836517&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84869836517&partnerID=8YFLogxK
U2 - 10.1080/10400419.2012.730331
DO - 10.1080/10400419.2012.730331
M3 - Article
AN - SCOPUS:84869836517
SN - 1040-0419
VL - 24
SP - 345
EP - 357
JO - Creativity Research Journal
JF - Creativity Research Journal
IS - 4
ER -