Abstract
This paper leverages a delta robot's structure and reliable coordinate control to develop a self-learning Chinese calligraphy-writing system that requires precise control. Ideally, to achieve human-like behavior, a delta robot should learn stroke trajectories autonomously and reproduce the stroke beauty of calligraphic characters. Unfortunately, state-of-the-art approaches have not yet considered the presentation of stroke beauty that results from the rotation and tilt angles of the brush. This paper presents an integrated system consisting of a stroke processing module, a hypothesis generation net (HGN) learning model with self-learning capability, a delta robot, and an image capture module. Our approach combines the stroke trajectories from the stroke processing module with the angle information from the HGN learning model to automatically produce five-degree-of-freedom action instructions. Based on these instructions, the delta robot performs the calligraphy writing. The image capture module then provides feedback to the writing system for error calculation and coordinate correction. We use the mean absolute percentage error to verify the performance of the writing results. A correction algorithm and linear regression improve the error correction results to below 2% error. After several cycles, the written results converge to the target sample. The written results produced by the delta robot demonstrate that our proposed system, with its learning ability, can write Chinese calligraphy aesthetically.
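The abstract describes a feedback loop in which written strokes are evaluated with the mean absolute percentage error (MAPE) and corrected with a linear-regression-based adjustment until the error falls below 2%. The paper's implementation details are not given here; the following Python sketch only illustrates one plausible form of such a loop, and all function names, coordinates, and the stand-in "robot + camera" model are purely illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of the MAPE evaluation and linear-regression correction
# loop described in the abstract. Names and data are illustrative only.
import numpy as np

def mape(target: np.ndarray, written: np.ndarray) -> float:
    """Mean absolute percentage error between target and written coordinates."""
    return float(np.mean(np.abs((target - written) / target)) * 100.0)

def fit_linear_correction(written: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Least-squares affine map sending written coordinates toward the targets."""
    A = np.hstack([written, np.ones((written.shape[0], 1))])  # [x, y, 1]
    W, *_ = np.linalg.lstsq(A, target, rcond=None)
    return W

def apply_correction(coords: np.ndarray, W: np.ndarray) -> np.ndarray:
    """Apply the fitted affine correction to commanded stroke coordinates."""
    A = np.hstack([coords, np.ones((coords.shape[0], 1))])
    return A @ W

# Illustrative loop: re-command stroke points until MAPE drops below 2%.
target = np.array([[10.0, 20.0], [12.0, 25.0], [15.0, 30.0]])  # sample stroke points
commanded = target.copy()
for cycle in range(5):
    written = commanded * 1.03 + 0.4        # stand-in for delta robot + image capture feedback
    error = mape(target, written)
    print(f"cycle {cycle}: MAPE = {error:.2f}%")
    if error < 2.0:
        break
    W = fit_linear_correction(written, target)
    commanded = apply_correction(commanded, W)
```

In this toy setup the affine correction effectively inverts the simulated distortion, so the loop converges within a couple of cycles; the real system relies on camera feedback of the actual written strokes instead of the synthetic distortion used above.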
| Original language | English |
| --- | --- |
| Pages (from-to) | 25801-25816 |
| Number of pages | 16 |
| Journal | IEEE Access |
| Volume | 11 |
| DOIs | |
| Publication status | Published - 2023 |
Keywords
- Chinese calligraphy
- hypothesis generation net
- image-to-action translation
- robotic calligraphy system
ASJC Scopus subject areas
- General Computer Science
- General Materials Science
- General Engineering
- Electrical and Electronic Engineering