Integration of Robotic Vision and Automatic Tool Changer Based on Sequential Motion Primitive for Performing Assembly Tasks

Zong Yue Deng*, Li Wei Kang*, Hsin Han Chiang, Hsiao Chi Li

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Studies on intelligent robotic manipulation systems have typically focused on programming efficiency, adaptive control and motion planning of robotic arms, and the action diversity of grippers. In this study, a decision tree and visual recognition were incorporated into a robotic arm to help it learn complex tasks. A task tree was employed for the automatic planning of complex tasks, in which a decision-making model generates complex task sets in real time from a pre-built motion dataset. Moreover, the model can analyze the rationality of the planned steps, introduce new tasks, and perform object analysis. A support vector machine is applied to identify the state of an object, and the model selects a suitable gripper through a rapid gripper-switching process by considering the characteristics of the target object. The effectiveness of the proposed approach was demonstrated on an assembly task.
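The paper does not publish its code; as a rough illustration of the SVM-based object-state step described in the abstract, the sketch below trains a linear support vector machine on toy geometric features and maps the predicted state to a tool-changer selection. The feature set, state labels, and gripper table are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: classify an object's state from simple geometric
# features, then map the predicted state to a gripper choice.
# Features, labels, and the gripper table are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

# Toy training data: [width_mm, height_mm, aspect_ratio]
X = np.array([
    [20.0, 5.0, 4.00],   # object lying flat
    [22.0, 6.0, 3.70],
    [5.0, 20.0, 0.25],   # object standing upright
    [6.0, 22.0, 0.27],
])
y = np.array(["flat", "flat", "upright", "upright"])

# Train a linear SVM to identify the object's state
clf = SVC(kernel="linear")
clf.fit(X, y)

# Assumed mapping from object state to tool-changer selection
GRIPPER_FOR_STATE = {"flat": "suction cup", "upright": "parallel jaw"}

def select_gripper(features):
    """Predict the object's state and return the matching gripper."""
    state = clf.predict([features])[0]
    return state, GRIPPER_FOR_STATE[state]

state, gripper = select_gripper([21.0, 5.5, 3.8])
```

In a real system the feature vector would come from the vision pipeline, and the gripper mapping would trigger the automatic tool changer; here both are stand-ins to show the classification-then-selection flow.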

Original language: English
Title of host publication: IFAC-PapersOnLine
Editors: Hideaki Ishii, Yoshio Ebihara, Jun-ichi Imura, Masaki Yamakita
Publisher: Elsevier B.V.
Number of pages: 6
ISBN (Electronic): 9781713872344
Publication status: Published - 2023 Jul 1
Event: 22nd IFAC World Congress - Yokohama, Japan
Duration: 2023 Jul 9 to 2023 Jul 14

Publication series

ISSN (Electronic): 2405-8963


Conference: 22nd IFAC World Congress


Keywords

  • Robotic arm
  • flexible automation
  • robotic vision
  • task tree
  • tool changer

ASJC Scopus subject areas

  • Control and Systems Engineering


