Abstract
Writing has always been a focus of Taiwanese high school English classrooms. However, with classes often exceeding 40 students, grading and giving feedback on students' writing can be a burden. Automated writing evaluation (AWE) systems have recently emerged that provide learners with computerized feedback and offer teachers an alternative way to correct essays. This study investigates the use of a newly developed AWE system, Correct English, in correcting students' writing, and examines whether the system provides feedback that learners can use to revise their texts. Ninety 12th-graders were recruited, each producing two compositions on assigned topics to be corrected by the system and by two human raters. The computer-generated feedback was then compared with that of the human raters. The results showed that one third of the feedback messages provided by Correct English were false alarms, and many common errors were left untreated. Additionally, the human raters could provide revisions or suggestions on unclear or ungrammatical sentences that are beyond the capabilities of Correct English. Although Correct English can help with correcting basic errors, there remains much room for improvement. Language learners should therefore use its feedback with caution and seek teachers' help when needed.
| Original language | English |
|---|---|
| Pages (from-to) | 99-131 |
| Number of pages | 33 |
| Journal | English Teaching and Learning |
| Volume | 41 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 2017 Dec 1 |
Keywords
- Automated writing evaluation
- Grammatical errors
- Teacher feedback
ASJC Scopus subject areas
- Education
- Linguistics and Language