Automated Chinese Essay Scoring Based on Multilevel Linguistic Features

Tao Hsing Chang*, Yao Ting Sung

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter

9 Citations (Scopus)

Abstract

Writing assessment is an important part of the process of mastering the linguistic skill of writing. However, it has not been implemented effectively on a large scale because essay scoring is very time-consuming. One solution to this problem is automated essay scoring (AES), in which machines score essays automatically. The application of AES to English learning has already been successful, but because of differences in linguistic characteristics, AES must be redesigned before it can be applied to Chinese learning. The purpose of this chapter is to introduce ACES, an automated system for scoring Chinese essays, and to explain the system's basic framework, design principles, and scoring accuracy. Unlike some end-to-end AES systems, the ACES framework is designed to provide more interpretable features. Experimental results show that the performance of ACES is stable and reliable, and on par with that of commercial English AES systems.
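To make the feature-based approach concrete, the following is a minimal illustrative sketch (not the ACES implementation) of scoring an essay from hand-crafted, interpretable features at several linguistic levels, combined by a simple linear model. The feature set, weights, scoring scale, and naive character-level "word" proxy are all hypothetical placeholders; a real system would use Chinese word segmentation, syntactic parsing, and learned weights.

```python
# Illustrative sketch of multilevel, interpretable essay features feeding a
# linear scoring function. All names and weights are assumptions, not ACES's.
from dataclasses import dataclass


@dataclass
class EssayFeatures:
    num_chars: int            # character level: essay length
    type_token_ratio: float   # word level: lexical variety (distinct / total)
    avg_sentence_len: float   # sentence level: mean sentence length in characters


def extract_features(essay: str) -> EssayFeatures:
    """Compute simple surface features at three linguistic levels."""
    # Split on common Chinese sentence-final punctuation.
    normalized = essay.replace("！", "。").replace("？", "。")
    sentences = [s for s in normalized.split("。") if s.strip()]
    # Naive token proxy: individual non-space characters
    # (a real system would run a Chinese word segmenter here).
    tokens = [ch for ch in essay if ch.strip()]
    total = max(len(tokens), 1)
    return EssayFeatures(
        num_chars=len(tokens),
        type_token_ratio=len(set(tokens)) / total,
        avg_sentence_len=len(tokens) / max(len(sentences), 1),
    )


def score(features: EssayFeatures) -> float:
    """Toy linear combination with made-up weights, clipped to a 0-6 scale."""
    raw = (
        0.004 * features.num_chars
        + 2.0 * features.type_token_ratio
        + 0.02 * features.avg_sentence_len
    )
    return max(0.0, min(6.0, raw))


if __name__ == "__main__":
    essay = "今天天氣很好。我和家人去公園散步，看到許多花。"
    feats = extract_features(essay)
    print(feats, round(score(feats), 2))
```

Because each feature has a direct linguistic interpretation, such a design lets the scorer report which levels (character, word, sentence) contributed to a score, which is the kind of interpretability the chapter contrasts with end-to-end systems.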

Original language: English
Title of host publication: Chinese Language Learning Sciences
Publisher: Springer Nature
Pages: 253-269
Number of pages: 17
DOIs
Publication status: Published - 2019

Publication series

Name: Chinese Language Learning Sciences
ISSN (Print): 2520-1719
ISSN (Electronic): 2520-1727

ASJC Scopus subject areas

  • Language and Linguistics
  • Education
  • Linguistics and Language
  • Computer Science Applications
