Comparing grammar feedback provided by teachers with an automated writing evaluation system

Hao Jan Howard Chen*, Huei Wen Sarah Cheng, Ting Yu Christine Yang

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

5 Citations (Scopus)

Abstract

Writing has always been a focus of English classrooms in Taiwanese high schools. However, with classes that often exceed 40 students, grading and giving feedback on students' writing can be a burden. Automated writing evaluation (AWE) systems, which provide learners with computerized feedback, have recently emerged as an alternative way to help teachers correct essays. This study investigates the use of a newly developed AWE system, Correct English, in correcting students' writing and examines whether the system can provide feedback that learners can use to revise their texts. Ninety 12th-graders were recruited, and each produced two compositions on assigned topics, which were corrected by the system and by two human raters. The computer-generated feedback was then compared with that of the human raters. The results showed that one third of the feedback messages provided by Correct English were false alarms and that many common errors were left untreated. In addition, the human raters could provide revisions or suggestions for unclear or ungrammatical sentences, which is beyond the capabilities of Correct English. Although Correct English can help correct basic errors, there remains much room for improvement. Language learners should therefore use its feedback with caution and seek teachers' help when needed.

Original language: English
Pages (from-to): 99-131
Number of pages: 33
Journal: English Teaching and Learning
Volume: 41
Issue number: 4
DOIs
Publication status: Published - 2017 Dec 1

ASJC Scopus subject areas

  • Education
  • Language and Linguistics
